Apr 16 19:51:19.137745 ip-10-0-131-77 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:51:19.137757 ip-10-0-131-77 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:51:19.137765 ip-10-0-131-77 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:51:19.137982 ip-10-0-131-77 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:51:29.280941 ip-10-0-131-77 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:51:29.280958 ip-10-0-131-77 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 501c1aed221940d4b5d911db0cce5202 --
Apr 16 19:53:53.548425 ip-10-0-131-77 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:53.976068 ip-10-0-131-77 kubenswrapper[2560]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:53.976068 ip-10-0-131-77 kubenswrapper[2560]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:53.976068 ip-10-0-131-77 kubenswrapper[2560]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:53.976068 ip-10-0-131-77 kubenswrapper[2560]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:53.976068 ip-10-0-131-77 kubenswrapper[2560]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:53.977547 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.977460 2560 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:53.982442 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982427 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:53.982442 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982443 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982446 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982450 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982453 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982456 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982459 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982462 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982464 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982467 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982470 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982473 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982476 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982479 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982481 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982484 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982487 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982489 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982493 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982496 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:53.982500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982499 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982502 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982505 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982508 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982510 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982513 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982516 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982519 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982521 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982523 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982526 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982529 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982531 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982534 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982536 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982538 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982541 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982544 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982546 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982548 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:53.982960 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982551 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982554 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982556 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982559 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982561 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982564 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982566 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982569 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982573 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982575 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982578 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982581 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982583 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982586 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982589 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982592 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982594 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982597 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982601 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982605 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:53.983473 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982607 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982610 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982613 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982616 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982619 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982622 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982624 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982627 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982629 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982632 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982635 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982638 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982640 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982643 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982646 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982649 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982652 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982654 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982657 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:53.983963 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982659 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982663 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982666 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982669 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982672 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982675 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.982677 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983055 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983060 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983063 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983066 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983070 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983072 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983075 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983078 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983080 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983083 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983086 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983088 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983091 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:53.984454 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983093 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983096 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983098 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983101 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983103 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983120 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983123 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983125 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983128 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983131 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983134 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983137 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983140 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983142 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983145 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983148 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983150 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983153 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983156 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983159 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:53.984941 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983162 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983164 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983167 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983170 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983173 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983175 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983178 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983181 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983183 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983186 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983188 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983191 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983193 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983196 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983199 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983201 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983204 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983206 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983209 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:53.985477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983212 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983214 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983218 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983220 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983223 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983226 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983230 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983234 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983237 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983241 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983244 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983247 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983250 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983253 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983255 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983258 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983261 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983263 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983266 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:53.985932 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983268 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983271 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983274 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983276 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983279 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983281 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983284 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983286 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983289 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983291 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983294 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983296 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983300 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983303 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.983305 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984556 2560 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984566 2560 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984573 2560 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984578 2560 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984583 2560 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984587 2560 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:53.986450 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984591 2560 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984596 2560 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984599 2560 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984602 2560 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984608 2560 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984611 2560 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984615 2560 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984618 2560 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984620 2560 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984623 2560 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984626 2560 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984629 2560 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984632 2560 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984636 2560 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984639 2560 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984642 2560 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984645 2560 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984648 2560 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984652 2560 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984655 2560 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984658 2560 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984661 2560 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984664 2560 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984667 2560 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:53.986957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984671 2560 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984674 2560 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984677 2560 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984681 2560 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984684 2560 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984687 2560 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984690 2560 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984693 2560 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984696 2560 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984701 2560 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984704 2560 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984706 2560 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984710 2560 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984720 2560 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984728 2560 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984731 2560 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984734 2560 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984737 2560 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984740 2560 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984743 2560 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984746 2560 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984749 2560 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984752 2560 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984754 2560 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984757 2560 flags.go:64]
FLAG: --feature-gates="" Apr 16 19:53:53.987550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984761 2560 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984764 2560 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984767 2560 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984771 2560 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984774 2560 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984777 2560 flags.go:64] FLAG: --help="false" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984780 2560 flags.go:64] FLAG: --hostname-override="ip-10-0-131-77.ec2.internal" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984783 2560 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984787 2560 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984790 2560 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984793 2560 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984797 2560 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984800 2560 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:53:53.984802 2560 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984805 2560 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984808 2560 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984811 2560 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984814 2560 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984817 2560 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984820 2560 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984823 2560 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984827 2560 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984830 2560 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984833 2560 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:53.988178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984836 2560 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984839 2560 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984842 2560 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984847 2560 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:53.988767 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984850 2560 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984852 2560 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984855 2560 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984858 2560 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984862 2560 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984865 2560 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984868 2560 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984872 2560 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984876 2560 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984880 2560 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984883 2560 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984886 2560 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984889 2560 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984892 2560 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984895 2560 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984898 2560 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984901 2560 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984908 2560 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984912 2560 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984914 2560 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:53.988767 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984917 2560 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984920 2560 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984926 2560 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984928 2560 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984932 2560 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984935 2560 flags.go:64] FLAG: --port="10250" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984938 2560 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984941 2560 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f5402a453a762114" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:53:53.984945 2560 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984948 2560 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984951 2560 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984954 2560 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984957 2560 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984960 2560 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984963 2560 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984966 2560 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984970 2560 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984974 2560 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984977 2560 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984980 2560 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984982 2560 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984985 2560 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984988 2560 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984992 2560 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984995 2560 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:53.989354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.984998 2560 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985000 2560 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985004 2560 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985007 2560 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985010 2560 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985013 2560 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985016 2560 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985019 2560 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985021 2560 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985024 2560 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985027 2560 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985030 2560 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985035 2560 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 
19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985038 2560 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985041 2560 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985045 2560 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985048 2560 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985051 2560 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985053 2560 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985056 2560 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985059 2560 flags.go:64] FLAG: --v="2" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985064 2560 flags.go:64] FLAG: --version="false" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985068 2560 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985072 2560 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985075 2560 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:53.989934 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985177 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985182 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985185 2560 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985187 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985190 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985197 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985200 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985203 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985206 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985208 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985211 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985215 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985219 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985221 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985224 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985227 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985229 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985232 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985235 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985237 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.990630 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985240 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985243 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985247 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985249 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985252 2560 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985254 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985257 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985259 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985262 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985264 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985267 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985270 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985272 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985275 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985277 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985280 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985283 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985287 2560 feature_gate.go:328] 
unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985290 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985292 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.991458 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985295 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985298 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985300 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985303 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985306 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985308 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985312 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985316 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985319 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985322 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985325 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985328 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985330 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985333 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985335 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985339 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985342 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985344 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985347 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.992122 ip-10-0-131-77 kubenswrapper[2560]: W0416 
19:53:53.985350 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985352 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985355 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985357 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985360 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985363 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985365 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985368 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985370 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985373 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985376 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985379 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985382 2560 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985384 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985387 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985390 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985393 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985395 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985398 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:53.992649 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985401 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985404 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985406 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985409 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985411 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985414 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: 
W0416 19:53:53.985417 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.985419 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.993208 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.985427 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.993980 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.993960 2560 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:53:53.994015 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.993981 2560 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994031 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994037 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994040 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994043 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994046 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.994049 ip-10-0-131-77 
kubenswrapper[2560]: W0416 19:53:53.994049 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.994049 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994052 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994055 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994058 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994061 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994064 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994066 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994069 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994072 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994075 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994078 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994080 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994083 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 
19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994085 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994088 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994091 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994094 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994096 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994099 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994101 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994104 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.994378 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994121 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994126 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994129 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994133 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994137 2560 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994141 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994145 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994149 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994152 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994155 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994157 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994160 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994164 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994166 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994169 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994171 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994174 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather 
Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994176 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994179 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.994879 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994181 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994185 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994189 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994192 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994194 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994197 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994200 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994202 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994205 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994207 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994210 
2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994212 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994215 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994218 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994221 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994223 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994226 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994228 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994230 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994234 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.995369 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994236 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994239 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994241 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.995893 ip-10-0-131-77 
kubenswrapper[2560]: W0416 19:53:53.994244 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994246 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994249 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994252 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994254 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994257 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994259 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994263 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994268 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994271 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994275 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994279 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994283 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994287 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994291 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994293 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994296 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.995893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994298 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.994304 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994406 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994413 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994416 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994418 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994421 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994424 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994427 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994432 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994436 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994440 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994443 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994446 2560 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994448 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994451 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.996405 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994454 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994457 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994459 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994462 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994464 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994467 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994469 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994472 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994476 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994478 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 
19:53:53.994481 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994483 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994486 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994489 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994491 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994493 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994496 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994498 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994501 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994504 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.996833 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994507 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994512 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994516 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 
19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994521 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994525 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994528 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994530 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994533 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994536 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994538 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994541 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994543 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994546 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994548 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994551 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994553 
2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994556 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994558 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994561 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994564 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.997333 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994566 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994570 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994573 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994576 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994578 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994580 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994583 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994585 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 
19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994588 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994590 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994593 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994595 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994598 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994600 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994603 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994608 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994612 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994616 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994618 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.997816 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994621 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994625 2560 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994628 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994631 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994634 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994636 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994639 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994642 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994645 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994647 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994650 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994652 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:53.994655 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.994660 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false 
EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.998305 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.995355 2560 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 19:53:53.998655 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.997391 2560 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 19:53:53.998655 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.998543 2560 server.go:1019] "Starting client certificate rotation" Apr 16 19:53:53.998655 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.998642 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:53.998736 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:53.998678 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:54.021912 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.021892 2560 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:54.024971 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.024956 2560 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:54.039583 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.039561 2560 log.go:25] "Validated CRI v1 runtime API" Apr 16 19:53:54.044812 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.044796 2560 log.go:25] "Validated CRI v1 image API" Apr 16 
19:53:54.046058 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.046041 2560 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 19:53:54.050220 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.050198 2560 fs.go:135] Filesystem UUIDs: map[146378e4-083d-468f-9085-babc941ace4d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 97a062a6-ec2f-4e27-96fe-539556718504:/dev/nvme0n1p4] Apr 16 19:53:54.050307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.050220 2560 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 19:53:54.053261 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.053241 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:54.055958 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.055848 2560 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:54.054044101 +0000 UTC m=+0.387425413 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100115 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec233c7a482e83cd4c979d514911e027 SystemUUID:ec233c7a-482e-83cd-4c97-9d514911e027 BootID:501c1aed-2219-40d4-b5d9-11db0cce5202 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 
DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:be:b4:f9:61:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:be:b4:f9:61:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:c4:7a:b8:de:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 19:53:54.055958 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.055947 2560 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm 
support. Perf event counters are not available. Apr 16 19:53:54.056131 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.056056 2560 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:54.057029 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.056999 2560 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:54.057211 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.057031 2560 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-77.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManage
rPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:53:54.057300 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.057225 2560 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:53:54.057300 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.057237 2560 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:53:54.057300 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.057261 2560 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:54.058037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.058025 2560 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:54.059263 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.059251 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:54.059388 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.059378 2560 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:53:54.061542 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.061531 2560 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:53:54.061597 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.061548 2560 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:53:54.061597 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.061564 2560 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:53:54.061597 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.061579 2560 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:53:54.061597 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.061591 2560 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:53:54.062680 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.062667 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:54.062745 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.062690 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:54.066832 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.066810 2560 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:54.068261 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.068242 2560 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:54.070139 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070123 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070148 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070156 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070162 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070169 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070175 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070180 2560 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070186 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070194 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:54.070214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070203 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:54.070463 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070223 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:54.070463 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070233 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:54.070463 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.070444 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zwmcw" Apr 16 19:53:54.071150 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.071139 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:54.071187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.071151 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:54.073854 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.073815 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:54.073915 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.073900 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-77.ec2.internal" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:54.073946 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.073919 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:54.074558 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.074546 2560 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:54.074591 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.074581 2560 server.go:1295] "Started kubelet" Apr 16 19:53:54.074664 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.074641 2560 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:54.075081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.075043 2560 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:54.075130 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.075098 2560 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:54.075565 ip-10-0-131-77 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:53:54.076085 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.076068 2560 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:54.077055 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.077041 2560 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:54.080318 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.080294 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zwmcw" Apr 16 19:53:54.081605 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.081583 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:54.081729 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.081711 2560 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:54.082593 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.081073 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-77.ec2.internal.18a6ee6822f2627b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-77.ec2.internal,UID:ip-10-0-131-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-77.ec2.internal,},FirstTimestamp:2026-04-16 19:53:54.074559099 +0000 UTC m=+0.407940411,LastTimestamp:2026-04-16 19:53:54.074559099 +0000 UTC m=+0.407940411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-77.ec2.internal,}" Apr 16 19:53:54.082696 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.082611 2560 volume_manager.go:295] "The 
desired_state_of_world populator starts" Apr 16 19:53:54.082696 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.082628 2560 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:54.082789 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.082721 2560 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:54.082931 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.082917 2560 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:54.082931 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.082931 2560 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:54.083554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083540 2560 factory.go:55] Registering systemd factory Apr 16 19:53:54.083645 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083591 2560 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:54.083854 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083837 2560 factory.go:153] Registering CRI-O factory Apr 16 19:53:54.083854 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083854 2560 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:54.083932 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083902 2560 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:54.083932 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083924 2560 factory.go:103] Registering Raw factory Apr 16 19:53:54.083994 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.083935 2560 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:54.083994 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.083955 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" 
not found" Apr 16 19:53:54.084509 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.084493 2560 manager.go:319] Starting recovery of all containers Apr 16 19:53:54.085475 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.085433 2560 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:54.093491 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.093330 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:54.095035 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.095018 2560 manager.go:324] Recovery completed Apr 16 19:53:54.096439 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.096417 2560 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-77.ec2.internal\" not found" node="ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.100004 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.099991 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.103891 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.103871 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.103971 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.103903 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.103971 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.103914 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.104421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.104407 2560 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:54.104421 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:53:54.104420 2560 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:54.104527 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.104435 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:54.106834 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.106820 2560 policy_none.go:49] "None policy: Start" Apr 16 19:53:54.106834 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.106835 2560 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:54.106935 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.106844 2560 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:54.157660 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.157646 2560 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.157678 2560 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.157691 2560 server.go:85] "Starting device plugin registration server" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.157922 2560 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.157934 2560 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.158027 2560 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.158104 2560 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:54.158554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.158129 2560 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 
19:53:54.158823 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.158724 2560 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:53:54.158823 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.158762 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.217190 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.217159 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:54.218342 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.218328 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:53:54.218405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.218352 2560 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:54.218405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.218370 2560 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:53:54.218405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.218376 2560 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:54.218405 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.218405 2560 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:54.220872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.220850 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:54.258982 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.258920 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.259803 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.259781 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.259893 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.259819 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.259893 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.259833 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.259893 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.259862 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.268085 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.268071 2560 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.268161 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.268092 2560 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-77.ec2.internal\": node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.279448 
ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.279431 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.318606 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.318584 2560 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal"] Apr 16 19:53:54.318673 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.318664 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.319490 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.319460 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.319550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.319505 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.319550 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.319519 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.322018 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.321992 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.322211 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322195 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.322279 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322230 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.322754 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322728 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.322836 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322758 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.322836 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322769 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.322836 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322732 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.322836 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322835 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.322976 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.322848 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.324078 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.324057 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.324078 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.324081 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:54.324734 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.324720 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:54.324734 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.324741 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:54.324875 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.324754 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:54.343631 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.343613 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-77.ec2.internal\" not found" node="ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.347965 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.347951 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-77.ec2.internal\" not found" node="ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.379827 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.379804 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.384702 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.384684 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.384779 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.384710 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.384779 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.384728 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13eeb99f39e0cf01e19afd359b85300f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-77.ec2.internal\" (UID: \"13eeb99f39e0cf01e19afd359b85300f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.480394 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.480363 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.485724 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485708 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.485773 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485734 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.485773 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485751 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13eeb99f39e0cf01e19afd359b85300f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-77.ec2.internal\" (UID: \"13eeb99f39e0cf01e19afd359b85300f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.485840 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485826 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13eeb99f39e0cf01e19afd359b85300f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-77.ec2.internal\" (UID: \"13eeb99f39e0cf01e19afd359b85300f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.485873 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485825 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.485873 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.485825 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31c74c94218edfc3256dbb69527ce761-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal\" (UID: \"31c74c94218edfc3256dbb69527ce761\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.581194 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.581105 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.645607 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.645567 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.651147 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.651128 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:54.681352 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.681327 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.781865 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.781828 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.882316 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.882243 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.982779 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:54.982750 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-77.ec2.internal\" not found" Apr 16 19:53:54.998065 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.998043 2560 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:53:54.998203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.998185 2560 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:54.998264 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:54.998213 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:55.032381 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.032360 2560 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:55.062725 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.062703 2560 apiserver.go:52] "Watching apiserver" Apr 16 19:53:55.074196 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.074172 2560 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:53:55.074498 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.074477 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-k8ssj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5","openshift-multus/multus-additional-cni-plugins-s45wn","openshift-multus/multus-dhphh","openshift-network-diagnostics/network-check-target-vxhp2","openshift-cluster-node-tuning-operator/tuned-rx55c","openshift-image-registry/node-ca-w48w2","openshift-multus/network-metrics-daemon-nnh4p","openshift-network-operator/iptables-alerter-z9rpg","openshift-ovn-kubernetes/ovnkube-node-29x9d"] Apr 16 19:53:55.075771 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.075750 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.076937 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.076911 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.078060 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.078018 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.078432 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.078414 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:53:55.078572 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.078557 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:53:55.078907 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.078888 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jm84x\"" Apr 16 19:53:55.079546 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.079301 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.079546 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.079344 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:53:55.079546 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.079384 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.079546 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.079432 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rklwx\"" Apr 16 19:53:55.080344 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080324 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:53:55.080446 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080402 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.080446 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080324 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:53:55.080590 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080570 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.080702 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080679 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ts624\"" Apr 16 19:53:55.080789 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080762 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:53:55.080894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.080876 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.081680 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.081662 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:55.081783 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.081680 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:55.081783 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.081730 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.083184 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.083674 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.083873 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:54 +0000 UTC" deadline="2028-01-18 01:28:15.439965286 +0000 UTC" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.083895 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15389h34m20.3560735s" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.084571 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" Apr 16 19:53:55.085860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.084737 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnz9r\"" Apr 16 19:53:55.086144 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:53:55.086038 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.086570 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.086551 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k2m7p\"" Apr 16 19:53:55.086664 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.086617 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.086752 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.086735 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.087206 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.086839 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:53:55.087206 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.087013 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.088081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088066 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-host\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.088081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088075 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.088236 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088090 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-os-release\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088236 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088141 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-os-release\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088236 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088209 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-multus-certs\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088387 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088244 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/17f8be1a-9a64-4b60-a18e-62b62402d4ed-agent-certs\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.088387 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088276 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-modprobe-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.088387 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088311 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-kubernetes\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.088387 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088336 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-sys\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.088564 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088364 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088564 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088423 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-sys-fs\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.088564 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088474 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbrp9\" (UniqueName: \"kubernetes.io/projected/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kube-api-access-qbrp9\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.088564 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088519 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxv9\" (UniqueName: \"kubernetes.io/projected/b8893787-cf58-4e1c-a147-11f9472fb1bc-kube-api-access-2hxv9\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.088564 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088553 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mw9n\" (UniqueName: \"kubernetes.io/projected/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-kube-api-access-2mw9n\") pod 
\"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088572 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-cnibin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088591 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-netns\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088606 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-hostroot\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088621 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088640 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cnibin\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088674 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088697 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088721 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-conf-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.088776 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088742 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-bin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.089273 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:53:55.088784 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5bn\" (UniqueName: \"kubernetes.io/projected/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-kube-api-access-bs5bn\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088822 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-kubelet\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088850 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/17f8be1a-9a64-4b60-a18e-62b62402d4ed-konnectivity-ca\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088866 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088884 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-socket-dir-parent\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088926 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088959 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-registration-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.088988 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089000 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089013 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-socket-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089035 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089056 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-tuned\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089079 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-multus\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089121 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089103 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-device-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089163 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-conf\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089186 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-host\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.089273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089209 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089230 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-system-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089252 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysconfig\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089278 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcgx\" (UniqueName: \"kubernetes.io/projected/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-kube-api-access-wlcgx\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089310 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-etc-kubernetes\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089280 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fbqq\"" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089346 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-run\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089405 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-systemd\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089436 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-var-lib-kubelet\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089470 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089479 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089492 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-cni-binary-copy\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089513 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-daemon-config\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089544 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-tmp\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089570 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-serviceca\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089594 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-k8s-cni-cncf-io\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089621 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-lib-modules\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089668 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.090097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089700 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xbt\" (UniqueName: 
\"kubernetes.io/projected/af1645bd-5537-49e3-ae71-81cf97501bb8-kube-api-access-s9xbt\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.090961 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.089726 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:55.090961 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.090591 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.090961 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.090591 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nqwng\"" Apr 16 19:53:55.090961 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.090653 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.090961 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.090732 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:53:55.092367 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092349 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:53:55.092469 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092408 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:53:55.092735 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092545 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal"] Apr 16 19:53:55.092735 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092578 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:53:55.092735 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092638 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s7ssk\"" Apr 16 19:53:55.092890 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092813 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:53:55.092890 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092820 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:53:55.092980 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.092968 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:53:55.093283 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.093262 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:55.093380 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.093325 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" Apr 16 19:53:55.098946 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.098928 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:55.103312 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.103296 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:55.103465 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.103451 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal"] Apr 16 19:53:55.105279 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.105263 2560 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:55.114891 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.114829 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cnnmm" Apr 16 19:53:55.123458 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.123439 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cnnmm" Apr 16 19:53:55.168893 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.168860 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13eeb99f39e0cf01e19afd359b85300f.slice/crio-a42b9368d9cf30b51b090167d2d44ec90c25ade675daf8f83eb82cb9b286dedf WatchSource:0}: Error finding container a42b9368d9cf30b51b090167d2d44ec90c25ade675daf8f83eb82cb9b286dedf: Status 404 returned error can't find the container with id a42b9368d9cf30b51b090167d2d44ec90c25ade675daf8f83eb82cb9b286dedf Apr 16 19:53:55.169084 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.169063 2560 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c74c94218edfc3256dbb69527ce761.slice/crio-55d22f806775740d28e2c536538de86f76e887362fb37a841c4e31c8eaf0d4b2 WatchSource:0}: Error finding container 55d22f806775740d28e2c536538de86f76e887362fb37a841c4e31c8eaf0d4b2: Status 404 returned error can't find the container with id 55d22f806775740d28e2c536538de86f76e887362fb37a841c4e31c8eaf0d4b2 Apr 16 19:53:55.174348 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.174334 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:53:55.183947 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.183930 2560 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:55.189857 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189837 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.189943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189862 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-conf-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.189943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189884 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: 
\"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.189943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189901 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-bin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.189943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189917 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5bn\" (UniqueName: \"kubernetes.io/projected/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-kube-api-access-bs5bn\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189947 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-conf-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189967 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-bin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.189994 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-log-socket\") pod \"ovnkube-node-29x9d\" (UID: 
\"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190027 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-config\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190055 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-kubelet\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190083 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/17f8be1a-9a64-4b60-a18e-62b62402d4ed-konnectivity-ca\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190156 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-kubelet\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190159 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wflwl\" (UniqueName: 
\"kubernetes.io/projected/a67ff119-3c66-4d24-b286-d876902613ad-kube-api-access-wflwl\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190196 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-kubelet\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190222 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-ovn\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190244 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-env-overrides\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190266 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovn-node-metrics-cert\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190296 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-socket-dir-parent\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190323 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190376 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-socket-dir-parent\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190373 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-registration-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190392 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 
19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190421 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190430 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-registration-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190453 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.190505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190483 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-netns\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190515 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-socket-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: 
\"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190549 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.190564 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190569 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/17f8be1a-9a64-4b60-a18e-62b62402d4ed-konnectivity-ca\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190581 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-tuned\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.190615 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:55.690592091 +0000 UTC m=+2.023973393 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190641 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-socket-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190684 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-systemd-units\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190704 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-multus\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190711 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190722 
2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-device-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190753 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-conf\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190763 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-device-dir\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190779 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-host\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190777 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-var-lib-cni-multus\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:53:55.190813 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.190945 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190822 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-host\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190837 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-system-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190844 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysctl-conf\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190861 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysconfig\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190851 2560 
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190885 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-system-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190889 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcgx\" (UniqueName: \"kubernetes.io/projected/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-kube-api-access-wlcgx\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190915 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-bin\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190938 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-etc-kubernetes\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190951 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-sysconfig\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190963 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-run\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190988 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-var-lib-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.190994 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-etc-kubernetes\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191016 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-systemd\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191019 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-run\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191063 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-var-lib-kubelet\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191091 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-slash\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191136 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.191614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191075 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-systemd\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191161 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-cni-binary-copy\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191167 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-var-lib-kubelet\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191187 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-daemon-config\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191210 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-tmp\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191218 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-cni-dir\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191233 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-serviceca\") 
pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191259 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-k8s-cni-cncf-io\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191283 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-lib-modules\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191300 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191310 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-script-lib\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191337 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191375 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-k8s-cni-cncf-io\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191365 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xbt\" (UniqueName: \"kubernetes.io/projected/af1645bd-5537-49e3-ae71-81cf97501bb8-kube-api-access-s9xbt\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191415 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191437 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-host\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191463 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a67ff119-3c66-4d24-b286-d876902613ad-iptables-alerter-script\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.192210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191487 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191512 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnv9t\" (UniqueName: \"kubernetes.io/projected/dea9bf41-88be-4138-b6f1-4334d36c5ca3-kube-api-access-jnv9t\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191556 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-os-release\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191603 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-os-release\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " 
pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191648 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-multus-certs\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191681 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/17f8be1a-9a64-4b60-a18e-62b62402d4ed-agent-certs\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191706 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-etc-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191717 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-cni-binary-copy\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191731 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af1645bd-5537-49e3-ae71-81cf97501bb8-multus-daemon-config\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " 
pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191740 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-netd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191782 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-serviceca\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191790 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-multus-certs\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191774 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-modprobe-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191830 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-kubernetes\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191855 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-lib-modules\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191871 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-modprobe-d\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191907 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-sys\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191935 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-os-release\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193037 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.191856 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-sys\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193856 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:53:55.191982 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-os-release\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192025 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-host\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192028 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-kubernetes\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192102 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192233 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-sys-fs\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 
19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192242 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-system-cni-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192274 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbrp9\" (UniqueName: \"kubernetes.io/projected/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kube-api-access-qbrp9\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192307 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxv9\" (UniqueName: \"kubernetes.io/projected/b8893787-cf58-4e1c-a147-11f9472fb1bc-kube-api-access-2hxv9\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192320 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-sys-fs\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192335 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a67ff119-3c66-4d24-b286-d876902613ad-host-slash\") pod \"iptables-alerter-z9rpg\" (UID: 
\"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192361 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192383 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-node-log\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192411 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mw9n\" (UniqueName: \"kubernetes.io/projected/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-kube-api-access-2mw9n\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192436 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-cnibin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192451 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192461 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-netns\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.193856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192485 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-hostroot\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192511 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192537 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-systemd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192544 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-cnibin\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192567 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cnibin\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192592 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-hostroot\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192630 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192635 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1645bd-5537-49e3-ae71-81cf97501bb8-host-run-netns\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192713 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-cnibin\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192763 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.192792 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.194444 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-tmp\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.194531 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.194473 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b8893787-cf58-4e1c-a147-11f9472fb1bc-etc-tuned\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.194845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.194665 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/17f8be1a-9a64-4b60-a18e-62b62402d4ed-agent-certs\") pod \"konnectivity-agent-k8ssj\" (UID: \"17f8be1a-9a64-4b60-a18e-62b62402d4ed\") " pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.197639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.197617 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5bn\" (UniqueName: \"kubernetes.io/projected/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-kube-api-access-bs5bn\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.198450 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.198425 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:55.198543 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.198454 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:55.198543 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.198467 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:55.198620 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.198599 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:55.698582715 +0000 UTC m=+2.031964038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:55.200397 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.200372 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxv9\" (UniqueName: \"kubernetes.io/projected/b8893787-cf58-4e1c-a147-11f9472fb1bc-kube-api-access-2hxv9\") pod \"tuned-rx55c\" (UID: \"b8893787-cf58-4e1c-a147-11f9472fb1bc\") " pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.200520 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.200503 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xbt\" (UniqueName: \"kubernetes.io/projected/af1645bd-5537-49e3-ae71-81cf97501bb8-kube-api-access-s9xbt\") pod \"multus-dhphh\" (UID: \"af1645bd-5537-49e3-ae71-81cf97501bb8\") " pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.200583 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.200536 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcgx\" (UniqueName: \"kubernetes.io/projected/038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8-kube-api-access-wlcgx\") pod \"node-ca-w48w2\" (UID: \"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8\") " pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.200643 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.200613 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mw9n\" (UniqueName: \"kubernetes.io/projected/2535cb5c-f95c-424b-a266-b74f5c7f4b0b-kube-api-access-2mw9n\") pod \"multus-additional-cni-plugins-s45wn\" (UID: \"2535cb5c-f95c-424b-a266-b74f5c7f4b0b\") " 
pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.200677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.200638 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbrp9\" (UniqueName: \"kubernetes.io/projected/73997f5d-4fb4-4fd0-b12c-ac04bb360d46-kube-api-access-qbrp9\") pod \"aws-ebs-csi-driver-node-kbbr5\" (UID: \"73997f5d-4fb4-4fd0-b12c-ac04bb360d46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.221934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.221897 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" event={"ID":"31c74c94218edfc3256dbb69527ce761","Type":"ContainerStarted","Data":"55d22f806775740d28e2c536538de86f76e887362fb37a841c4e31c8eaf0d4b2"} Apr 16 19:53:55.222790 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.222772 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" event={"ID":"13eeb99f39e0cf01e19afd359b85300f","Type":"ContainerStarted","Data":"a42b9368d9cf30b51b090167d2d44ec90c25ade675daf8f83eb82cb9b286dedf"} Apr 16 19:53:55.293096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293072 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293194 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293100 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-log-socket\") pod \"ovnkube-node-29x9d\" (UID: 
\"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293194 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293131 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-config\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293194 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293157 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wflwl\" (UniqueName: \"kubernetes.io/projected/a67ff119-3c66-4d24-b286-d876902613ad-kube-api-access-wflwl\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.293194 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293176 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293194 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293192 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-kubelet\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293201 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-log-socket\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293217 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-ovn\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293238 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-env-overrides\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293259 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-kubelet\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293263 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovn-node-metrics-cert\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293273 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-ovn\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293334 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-netns\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293361 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-systemd-units\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293391 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-netns\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293430 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-bin\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293434 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-systemd-units\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293437 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293395 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-bin\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293469 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-var-lib-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293496 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-slash\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293527 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-script-lib\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293565 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a67ff119-3c66-4d24-b286-d876902613ad-iptables-alerter-script\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293582 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-var-lib-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293590 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293586 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-slash\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293618 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnv9t\" (UniqueName: \"kubernetes.io/projected/dea9bf41-88be-4138-b6f1-4334d36c5ca3-kube-api-access-jnv9t\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293648 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-etc-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293659 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-run-ovn-kubernetes\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293673 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-netd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293708 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a67ff119-3c66-4d24-b286-d876902613ad-host-slash\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293710 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-env-overrides\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:53:55.293728 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-config\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293732 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293776 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a67ff119-3c66-4d24-b286-d876902613ad-host-slash\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.293941 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293782 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-etc-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293777 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-node-log\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293812 
2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-node-log\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293821 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-host-cni-netd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293843 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-systemd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293848 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-openvswitch\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.293893 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dea9bf41-88be-4138-b6f1-4334d36c5ca3-run-systemd\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.294070 2560 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovnkube-script-lib\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.294421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.294071 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a67ff119-3c66-4d24-b286-d876902613ad-iptables-alerter-script\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.295231 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.295210 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dea9bf41-88be-4138-b6f1-4334d36c5ca3-ovn-node-metrics-cert\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.300197 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.300180 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wflwl\" (UniqueName: \"kubernetes.io/projected/a67ff119-3c66-4d24-b286-d876902613ad-kube-api-access-wflwl\") pod \"iptables-alerter-z9rpg\" (UID: \"a67ff119-3c66-4d24-b286-d876902613ad\") " pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.300727 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.300706 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnv9t\" (UniqueName: \"kubernetes.io/projected/dea9bf41-88be-4138-b6f1-4334d36c5ca3-kube-api-access-jnv9t\") pod \"ovnkube-node-29x9d\" (UID: \"dea9bf41-88be-4138-b6f1-4334d36c5ca3\") " pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.402818 ip-10-0-131-77 kubenswrapper[2560]: 
I0416 19:53:55.402770 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k8ssj" Apr 16 19:53:55.409402 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.409382 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f8be1a_9a64_4b60_a18e_62b62402d4ed.slice/crio-72a60f3744fc66070b170cfb4dd2a87b511a6cc09b6b96c9cac874d843bc2267 WatchSource:0}: Error finding container 72a60f3744fc66070b170cfb4dd2a87b511a6cc09b6b96c9cac874d843bc2267: Status 404 returned error can't find the container with id 72a60f3744fc66070b170cfb4dd2a87b511a6cc09b6b96c9cac874d843bc2267 Apr 16 19:53:55.420801 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.420779 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" Apr 16 19:53:55.426096 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.426071 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73997f5d_4fb4_4fd0_b12c_ac04bb360d46.slice/crio-48ac14ec39e8dc591120ede1d5c4e25c32c6a54bc7b3a74d076e658eb2021981 WatchSource:0}: Error finding container 48ac14ec39e8dc591120ede1d5c4e25c32c6a54bc7b3a74d076e658eb2021981: Status 404 returned error can't find the container with id 48ac14ec39e8dc591120ede1d5c4e25c32c6a54bc7b3a74d076e658eb2021981 Apr 16 19:53:55.433616 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.433598 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s45wn" Apr 16 19:53:55.437176 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.437157 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dhphh" Apr 16 19:53:55.439200 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.439182 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2535cb5c_f95c_424b_a266_b74f5c7f4b0b.slice/crio-9fd3d4c4cf47290eff9482614114e65413e8392b5f504a483207c69c002f596b WatchSource:0}: Error finding container 9fd3d4c4cf47290eff9482614114e65413e8392b5f504a483207c69c002f596b: Status 404 returned error can't find the container with id 9fd3d4c4cf47290eff9482614114e65413e8392b5f504a483207c69c002f596b Apr 16 19:53:55.443545 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.443528 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1645bd_5537_49e3_ae71_81cf97501bb8.slice/crio-e2807227e4d4ebe3f5c0cad65b5cca38c1517ca7e6c7cd53e6a1248a8356d2fc WatchSource:0}: Error finding container e2807227e4d4ebe3f5c0cad65b5cca38c1517ca7e6c7cd53e6a1248a8356d2fc: Status 404 returned error can't find the container with id e2807227e4d4ebe3f5c0cad65b5cca38c1517ca7e6c7cd53e6a1248a8356d2fc Apr 16 19:53:55.467884 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.467865 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rx55c" Apr 16 19:53:55.473406 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.473384 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w48w2" Apr 16 19:53:55.473723 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.473698 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8893787_cf58_4e1c_a147_11f9472fb1bc.slice/crio-3a9bc1565bd72b077d854f30299994f7455a5aadd63e00a4b51db9d6ef712798 WatchSource:0}: Error finding container 3a9bc1565bd72b077d854f30299994f7455a5aadd63e00a4b51db9d6ef712798: Status 404 returned error can't find the container with id 3a9bc1565bd72b077d854f30299994f7455a5aadd63e00a4b51db9d6ef712798 Apr 16 19:53:55.479153 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.479133 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z9rpg" Apr 16 19:53:55.479430 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.479410 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038b7332_e7c9_4001_a9ec_bfe5e7d7e1c8.slice/crio-ac56be44db993b6171229e4c38507f32cad23a686129c923d2be23762ac0dec4 WatchSource:0}: Error finding container ac56be44db993b6171229e4c38507f32cad23a686129c923d2be23762ac0dec4: Status 404 returned error can't find the container with id ac56be44db993b6171229e4c38507f32cad23a686129c923d2be23762ac0dec4 Apr 16 19:53:55.484016 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.483992 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" Apr 16 19:53:55.484929 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.484907 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67ff119_3c66_4d24_b286_d876902613ad.slice/crio-17db3b56febf49e7972b39b998e26eb897c68511f540c32affdfac846c19e07e WatchSource:0}: Error finding container 17db3b56febf49e7972b39b998e26eb897c68511f540c32affdfac846c19e07e: Status 404 returned error can't find the container with id 17db3b56febf49e7972b39b998e26eb897c68511f540c32affdfac846c19e07e Apr 16 19:53:55.490009 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:53:55.489974 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea9bf41_88be_4138_b6f1_4334d36c5ca3.slice/crio-43a55044088cb0bdc5b27166bf8fabec6e7d880210a5fa695b2ed90325459107 WatchSource:0}: Error finding container 43a55044088cb0bdc5b27166bf8fabec6e7d880210a5fa695b2ed90325459107: Status 404 returned error can't find the container with id 43a55044088cb0bdc5b27166bf8fabec6e7d880210a5fa695b2ed90325459107 Apr 16 19:53:55.696133 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.696042 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:55.696280 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.696194 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:55.696280 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.696263 2560 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:56.696245111 +0000 UTC m=+3.029626411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:55.796453 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:55.796418 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:55.796640 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.796587 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:55.796640 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.796605 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:55.796640 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.796617 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:55.796852 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:55.796671 2560 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:56.79665297 +0000 UTC m=+3.130034273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:56.124702 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.124604 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:55 +0000 UTC" deadline="2027-12-28 04:48:04.892511574 +0000 UTC" Apr 16 19:53:56.124702 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.124651 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14888h54m8.767863811s" Apr 16 19:53:56.233163 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.233060 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"43a55044088cb0bdc5b27166bf8fabec6e7d880210a5fa695b2ed90325459107"} Apr 16 19:53:56.239827 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.239784 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w48w2" event={"ID":"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8","Type":"ContainerStarted","Data":"ac56be44db993b6171229e4c38507f32cad23a686129c923d2be23762ac0dec4"} Apr 16 19:53:56.241826 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.241759 2560 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dhphh" event={"ID":"af1645bd-5537-49e3-ae71-81cf97501bb8","Type":"ContainerStarted","Data":"e2807227e4d4ebe3f5c0cad65b5cca38c1517ca7e6c7cd53e6a1248a8356d2fc"} Apr 16 19:53:56.244257 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.244212 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k8ssj" event={"ID":"17f8be1a-9a64-4b60-a18e-62b62402d4ed","Type":"ContainerStarted","Data":"72a60f3744fc66070b170cfb4dd2a87b511a6cc09b6b96c9cac874d843bc2267"} Apr 16 19:53:56.249788 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.249743 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z9rpg" event={"ID":"a67ff119-3c66-4d24-b286-d876902613ad","Type":"ContainerStarted","Data":"17db3b56febf49e7972b39b998e26eb897c68511f540c32affdfac846c19e07e"} Apr 16 19:53:56.254567 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.254544 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rx55c" event={"ID":"b8893787-cf58-4e1c-a147-11f9472fb1bc","Type":"ContainerStarted","Data":"3a9bc1565bd72b077d854f30299994f7455a5aadd63e00a4b51db9d6ef712798"} Apr 16 19:53:56.257215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.257189 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerStarted","Data":"9fd3d4c4cf47290eff9482614114e65413e8392b5f504a483207c69c002f596b"} Apr 16 19:53:56.261680 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.261618 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" event={"ID":"73997f5d-4fb4-4fd0-b12c-ac04bb360d46","Type":"ContainerStarted","Data":"48ac14ec39e8dc591120ede1d5c4e25c32c6a54bc7b3a74d076e658eb2021981"} Apr 16 19:53:56.314307 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:53:56.314279 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:56.337606 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.337392 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:56.603560 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.603460 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5gp27"] Apr 16 19:53:56.605477 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.605454 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.605616 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.605534 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:53:56.703138 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.703030 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-dbus\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.703138 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.703080 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.703356 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.703157 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-kubelet-config\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.703356 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.703206 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:56.703356 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.703333 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:56.703524 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.703393 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.703375982 +0000 UTC m=+5.036757292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.804079 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-kubelet-config\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.804183 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-dbus\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.804212 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.805213 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.804245 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.804382 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.804400 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.804412 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.804466 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.8044496 +0000 UTC m=+5.137830905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.804864 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-kubelet-config\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:56.805016 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/243af810-99f2-40cb-b920-2355426fbf4e-dbus\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.805129 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:56.805213 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:56.805180 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:57.305161077 +0000 UTC m=+3.638542379 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.125205 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:57.125144 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:55 +0000 UTC" deadline="2027-09-13 07:51:18.650717008 +0000 UTC" Apr 16 19:53:57.125205 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:57.125185 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12347h57m21.525535613s" Apr 16 19:53:57.219403 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:57.219372 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:57.219563 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:57.219513 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:53:57.219979 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:57.219957 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:57.220090 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:57.220056 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:53:57.308478 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:57.308438 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:57.308701 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:57.308683 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:57.308791 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:57.308769 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.30875049 +0000 UTC m=+4.642131803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.222137 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:58.221735 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:58.222137 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.221862 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:53:58.317492 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:58.317452 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:53:58.317682 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.317666 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.317749 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.317733 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:00.317715066 +0000 UTC m=+6.651096378 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.721629 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:58.721543 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:58.721775 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.721734 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.721845 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.721799 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:02.721779464 +0000 UTC m=+9.055160777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.822215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:58.822139 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:58.822366 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.822346 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.822437 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.822370 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.822437 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.822386 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.822534 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:58.822458 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:02.822437312 +0000 UTC m=+9.155818634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.220005 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:59.219554 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:53:59.220005 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:59.219682 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:53:59.220005 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:53:59.219557 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:53:59.220005 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:53:59.219996 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:00.219306 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:00.219272 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:00.219772 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:00.219411 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:00.334056 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:00.334017 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:00.334288 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:00.334260 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:00.334412 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:00.334347 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.334323666 +0000 UTC m=+10.667704970 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.218695 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:01.218659 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:01.218900 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:01.218857 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:01.219011 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:01.218986 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:01.219105 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:01.219081 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:02.219142 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:02.219095 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:02.219556 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.219258 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:02.754230 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:02.754190 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:02.754402 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.754376 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.754459 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.754442 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:10.754421802 +0000 UTC m=+17.087803111 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.856172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:02.855591 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:02.856172 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.855747 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:02.856172 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.855769 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:02.856172 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.855782 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.856172 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:02.855843 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:10.855825557 +0000 UTC m=+17.189206861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:03.218886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:03.218806 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:03.219041 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:03.218949 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:03.219336 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:03.219318 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:03.219667 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:03.219415 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:04.220398 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:04.220359 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:04.220833 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:04.220487 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:04.367022 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:04.366965 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:04.367222 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:04.367178 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:04.367304 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:04.367251 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.367229816 +0000 UTC m=+18.700611128 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:05.218828 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:05.218567 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:05.218828 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:05.218567 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:05.219058 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:05.218878 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:05.219058 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:05.218954 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:06.219195 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:06.219162 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:06.219590 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:06.219293 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:07.218570 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:07.218530 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:07.218730 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:07.218530 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:07.218730 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:07.218657 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:07.218833 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:07.218756 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:08.219178 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:08.219139 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:08.219533 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:08.219251 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:09.218692 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:09.218664 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:09.218862 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:09.218665 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:09.218862 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:09.218780 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:09.218969 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:09.218887 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:10.219346 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:10.219316 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:10.219797 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.219423 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:10.816878 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:10.816844 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:10.817099 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.816965 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:10.817099 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.817025 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.817009515 +0000 UTC m=+33.150390814 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:10.918207 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:10.918171 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:10.918392 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.918358 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:10.918392 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.918384 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:10.918504 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.918397 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:10.918504 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:10.918459 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.918442009 +0000 UTC m=+33.251823313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:11.218809 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:11.218711 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:11.218949 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:11.218711 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:11.218949 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:11.218845 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:11.218949 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:11.218901 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:12.219278 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:12.219242 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:12.219757 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:12.219370 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:12.430081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:12.430040 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:12.430259 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:12.430202 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:12.430317 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:12.430264 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret podName:243af810-99f2-40cb-b920-2355426fbf4e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.43024802 +0000 UTC m=+34.763629336 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret") pod "global-pull-secret-syncer-5gp27" (UID: "243af810-99f2-40cb-b920-2355426fbf4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:13.219302 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.219269 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:13.219657 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.219277 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:13.219657 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:13.219382 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:13.219657 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:13.219462 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:13.297044 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.296970 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" event={"ID":"13eeb99f39e0cf01e19afd359b85300f","Type":"ContainerStarted","Data":"35651f976f4179881e5cfb2a696df2e338811d6888e5f0535ca350e573f94ea5"}
Apr 16 19:54:13.304045 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.304022 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rx55c" event={"ID":"b8893787-cf58-4e1c-a147-11f9472fb1bc","Type":"ContainerStarted","Data":"5e8dc7e3509ee2bbfedabea24c204953ea152f322da526191bc59886f81bc146"}
Apr 16 19:54:13.305245 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.305224 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dhphh" event={"ID":"af1645bd-5537-49e3-ae71-81cf97501bb8","Type":"ContainerStarted","Data":"72351e3eb5c45e6f5d8edc03171f98f20d65c663318752f5bdfa5b7dba52a2ee"}
Apr 16 19:54:13.309186 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.309149 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-77.ec2.internal" podStartSLOduration=18.30913819 podStartE2EDuration="18.30913819s" podCreationTimestamp="2026-04-16 19:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:13.308929863 +0000 UTC m=+19.642311194" watchObservedRunningTime="2026-04-16 19:54:13.30913819 +0000 UTC m=+19.642519511"
Apr 16 19:54:13.322455 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.322415 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rx55c" podStartSLOduration=1.850068813 podStartE2EDuration="19.322405124s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.475378509 +0000 UTC m=+1.808759808" lastFinishedPulling="2026-04-16 19:54:12.947714817 +0000 UTC m=+19.281096119" observedRunningTime="2026-04-16 19:54:13.322022828 +0000 UTC m=+19.655404149" watchObservedRunningTime="2026-04-16 19:54:13.322405124 +0000 UTC m=+19.655786445"
Apr 16 19:54:13.335572 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:13.335533 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dhphh" podStartSLOduration=1.8053420999999998 podStartE2EDuration="19.335519332s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.445271549 +0000 UTC m=+1.778652856" lastFinishedPulling="2026-04-16 19:54:12.975448775 +0000 UTC m=+19.308830088" observedRunningTime="2026-04-16 19:54:13.335175019 +0000 UTC m=+19.668556340" watchObservedRunningTime="2026-04-16 19:54:13.335519332 +0000 UTC m=+19.668900653"
Apr 16 19:54:14.219779 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.219746 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:14.220506 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:14.219838 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:14.307492 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.307261 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w48w2" event={"ID":"038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8","Type":"ContainerStarted","Data":"b7add1af71d40a04298083fc80916a2282c424a224cc1afde4871ef1208bbb0a"}
Apr 16 19:54:14.308540 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.308517 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k8ssj" event={"ID":"17f8be1a-9a64-4b60-a18e-62b62402d4ed","Type":"ContainerStarted","Data":"5e4948e1923cab5df6af0448f465e86e1ffa24cf364548a9c3a1cd9fa472228f"}
Apr 16 19:54:14.309772 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.309752 2560 generic.go:358] "Generic (PLEG): container finished" podID="31c74c94218edfc3256dbb69527ce761" containerID="780038f1c8645578772a71fd2e63dc10434cf66a2466f04ab917c6a3262ba98f" exitCode=0
Apr 16 19:54:14.309845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.309811 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" event={"ID":"31c74c94218edfc3256dbb69527ce761","Type":"ContainerDied","Data":"780038f1c8645578772a71fd2e63dc10434cf66a2466f04ab917c6a3262ba98f"}
Apr 16 19:54:14.311003 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.310976 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z9rpg" event={"ID":"a67ff119-3c66-4d24-b286-d876902613ad","Type":"ContainerStarted","Data":"01932a36e74695d8b954963aeb7bc7e9aad71faf6240bec98e90e48186240882"}
Apr 16 19:54:14.312226 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.312205 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="cd4c2c21d28764de35930867025c4a18447ab5f1fc90f135ce14cd6040e53cf5" exitCode=0
Apr 16 19:54:14.312309 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.312260 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"cd4c2c21d28764de35930867025c4a18447ab5f1fc90f135ce14cd6040e53cf5"}
Apr 16 19:54:14.313567 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.313490 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" event={"ID":"73997f5d-4fb4-4fd0-b12c-ac04bb360d46","Type":"ContainerStarted","Data":"22bc43306179c6148ad911e232c623e084b04334d51a4345bd6a49ab481a1e58"}
Apr 16 19:54:14.315996 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.315975 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"cb1e457536608fda87706b333bfaf20a21772d9951604da3d9468e346acfdf1c"}
Apr 16 19:54:14.316075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.316005 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"37d500ed1e6d5a7f815e44a6c5d8b279e62df74fa1e8266761631922ee269e89"}
Apr 16 19:54:14.316075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.316024 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"cffcd61ff7901e72158fa5a73aff3093b71afdb2587de581754287d43fcfeaab"}
Apr 16 19:54:14.316075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.316035 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"792113c0d1f083596f38a4a3962b8631ab2cf279b24d92e7500a89b2bae15a3c"}
Apr 16 19:54:14.316075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.316046 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"9286b824227096b0c6b604f7eaa488ba2a3b084b2d6b89789ac4c915e3d6a498"}
Apr 16 19:54:14.316075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.316054 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"b3dc418310342f61421325899961626a30d096a8ceaba754b32cadb5c8e00388"}
Apr 16 19:54:14.332610 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.332564 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w48w2" podStartSLOduration=2.868347716 podStartE2EDuration="20.332553299s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.480731919 +0000 UTC m=+1.814113217" lastFinishedPulling="2026-04-16 19:54:12.944937499 +0000 UTC m=+19.278318800" observedRunningTime="2026-04-16 19:54:14.332430317 +0000 UTC m=+20.665811649" watchObservedRunningTime="2026-04-16 19:54:14.332553299 +0000 UTC m=+20.665934619"
Apr 16 19:54:14.349639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.349596 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-k8ssj" podStartSLOduration=2.81645126 podStartE2EDuration="20.349584333s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.4123373 +0000 UTC m=+1.745718600" lastFinishedPulling="2026-04-16 19:54:12.945470368 +0000 UTC m=+19.278851673" observedRunningTime="2026-04-16 19:54:14.348004989 +0000 UTC m=+20.681386310" watchObservedRunningTime="2026-04-16 19:54:14.349584333 +0000 UTC m=+20.682965652"
Apr 16 19:54:14.387325 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.387283 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z9rpg" podStartSLOduration=2.929343825 podStartE2EDuration="20.38726924s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.48701427 +0000 UTC m=+1.820395574" lastFinishedPulling="2026-04-16 19:54:12.944939684 +0000 UTC m=+19.278320989" observedRunningTime="2026-04-16 19:54:14.36620422 +0000 UTC m=+20.699585541" watchObservedRunningTime="2026-04-16 19:54:14.38726924 +0000 UTC m=+20.720650561"
Apr 16 19:54:14.943320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:14.943284 2560 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 19:54:15.169620 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.169461 2560 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:14.943304181Z","UUID":"fe2c93f1-e793-4b20-88d3-6f1b8f583f0a","Handler":null,"Name":"","Endpoint":""}
Apr 16 19:54:15.171170 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.171149 2560 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 19:54:15.171300 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.171177 2560 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 19:54:15.219031 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.218999 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:15.219180 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.218999 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:15.219180 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:15.219155 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:15.219298 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:15.219239 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:15.319771 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.319738 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" event={"ID":"31c74c94218edfc3256dbb69527ce761","Type":"ContainerStarted","Data":"08c4431fb47e8989f3f8d19e710e312b57f0f028ba60493c86e9076d85165ae9"}
Apr 16 19:54:15.321470 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.321445 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" event={"ID":"73997f5d-4fb4-4fd0-b12c-ac04bb360d46","Type":"ContainerStarted","Data":"3519b55c74b499189a948d41b58815c346ae185b8fdaafaabd460cba25c88f4a"}
Apr 16 19:54:15.335184 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.335135 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-77.ec2.internal" podStartSLOduration=20.335098679 podStartE2EDuration="20.335098679s" podCreationTimestamp="2026-04-16 19:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:15.334224084 +0000 UTC m=+21.667605405" watchObservedRunningTime="2026-04-16 19:54:15.335098679 +0000 UTC m=+21.668480000"
Apr 16 19:54:15.662273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.662192 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-k8ssj"
Apr 16 19:54:15.662818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:15.662797 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-k8ssj"
Apr 16 19:54:16.219369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.219339 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:16.219562 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:16.219449 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:16.326085 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.325998 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"f2915d4b8539e094914f69e41e40339eb0d8cb2e66d0128dd092002079aa1e28"}
Apr 16 19:54:16.328015 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.327979 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" event={"ID":"73997f5d-4fb4-4fd0-b12c-ac04bb360d46","Type":"ContainerStarted","Data":"df59bffaec49af53ffaf678e72ddb83cb83847b7f4ca0bec6b529d73f4a2d9d4"}
Apr 16 19:54:16.328332 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.328309 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-k8ssj"
Apr 16 19:54:16.328879 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.328860 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-k8ssj"
Apr 16 19:54:16.361759 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:16.361714 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbbr5" podStartSLOduration=1.983722224 podStartE2EDuration="22.361702455s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.42755983 +0000 UTC m=+1.760941128" lastFinishedPulling="2026-04-16 19:54:15.805540058 +0000 UTC m=+22.138921359" observedRunningTime="2026-04-16 19:54:16.345721255 +0000 UTC m=+22.679102580" watchObservedRunningTime="2026-04-16 19:54:16.361702455 +0000 UTC m=+22.695083779"
Apr 16 19:54:17.218588 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:17.218551 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:17.218765 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:17.218551 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:17.218765 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:17.218691 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:17.218885 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:17.218804 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:18.219394 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:18.219354 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:18.219787 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:18.219496 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:19.218954 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.218791 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:19.219137 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.218791 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:19.219137 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:19.219032 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:19.219241 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:19.219141 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:19.334250 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.334211 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="0ea89d44d48954b4647a269994d7fd933e46295b56816222d67da4a1adb39d30" exitCode=0
Apr 16 19:54:19.334975 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.334258 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"0ea89d44d48954b4647a269994d7fd933e46295b56816222d67da4a1adb39d30"}
Apr 16 19:54:19.337515 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.337490 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" event={"ID":"dea9bf41-88be-4138-b6f1-4334d36c5ca3","Type":"ContainerStarted","Data":"eb444a37f4a93745141ba62909731cae6e7f60ae0e3bbf860ea53eff4f9cb6df"}
Apr 16 19:54:19.337831 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.337815 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:19.352283 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.352266 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:19.418668 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.418590 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d" podStartSLOduration=7.454928458 podStartE2EDuration="25.418577011s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.492037149 +0000 UTC m=+1.825418460" lastFinishedPulling="2026-04-16 19:54:13.4556857 +0000 UTC m=+19.789067013" observedRunningTime="2026-04-16 19:54:19.418155696 +0000 UTC m=+25.751537017" watchObservedRunningTime="2026-04-16 19:54:19.418577011 +0000 UTC m=+25.751958350"
Apr 16 19:54:19.978542 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:19.978507 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:20.219056 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.218821 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:20.219224 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:20.219201 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:20.340951 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.340927 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="75cdaa95e78f0a5453010e7c2c9e945f6cb34c40928d26c25295beddab078cd7" exitCode=0
Apr 16 19:54:20.341339 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.341013 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"75cdaa95e78f0a5453010e7c2c9e945f6cb34c40928d26c25295beddab078cd7"}
Apr 16 19:54:20.341415 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.341397 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:20.354654 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.354635 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:20.388007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.387971 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5gp27"]
Apr 16 19:54:20.388164 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.388088 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27"
Apr 16 19:54:20.388241 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:20.388221 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e"
Apr 16 19:54:20.391125 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.391092 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nnh4p"]
Apr 16 19:54:20.391225 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.391190 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:20.391272 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:20.391259 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0"
Apr 16 19:54:20.405099 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.405074 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vxhp2"]
Apr 16 19:54:20.405187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.405178 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:20.405259 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:20.405241 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636"
Apr 16 19:54:20.472641 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.472612 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4ppnq"]
Apr 16 19:54:20.475459 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.475443 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.478302 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.478282 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j2vdp\""
Apr 16 19:54:20.478421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.478348 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:54:20.479518 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.479503 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:54:20.596331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.596254 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5swh\" (UniqueName: \"kubernetes.io/projected/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-kube-api-access-t5swh\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.596331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.596298 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-hosts-file\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.596506 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.596333 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-tmp-dir\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.697690 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.697664 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5swh\" (UniqueName: \"kubernetes.io/projected/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-kube-api-access-t5swh\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.697856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.697709 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-hosts-file\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.697856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.697787 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-hosts-file\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.697856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.697821 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-tmp-dir\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.698271 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.698245 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-tmp-dir\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.709104 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.709078 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5swh\" (UniqueName: \"kubernetes.io/projected/f46654e0-89ad-48e3-ae92-6dec0b5e5d80-kube-api-access-t5swh\") pod \"node-resolver-4ppnq\" (UID: \"f46654e0-89ad-48e3-ae92-6dec0b5e5d80\") " pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.784026 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:20.783994 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4ppnq"
Apr 16 19:54:20.861477 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:20.861428 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46654e0_89ad_48e3_ae92_6dec0b5e5d80.slice/crio-eae96275594c71cd2fcb20b705e627378cdf2cdfec271809d9ac5733ff132917 WatchSource:0}: Error finding container eae96275594c71cd2fcb20b705e627378cdf2cdfec271809d9ac5733ff132917: Status 404 returned error can't find the container with id eae96275594c71cd2fcb20b705e627378cdf2cdfec271809d9ac5733ff132917
Apr 16 19:54:21.345206 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:21.345131 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="989c989f33e9453e6ba5f321719b1f8d5a1e870af3700bf035782058fc7fef02" exitCode=0
Apr 16 19:54:21.345654 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:21.345208 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"989c989f33e9453e6ba5f321719b1f8d5a1e870af3700bf035782058fc7fef02"}
Apr 16 19:54:21.346588 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:21.346470 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4ppnq" event={"ID":"f46654e0-89ad-48e3-ae92-6dec0b5e5d80","Type":"ContainerStarted","Data":"9c88ffa9ae888f657e7c26417363f5818cd499a5fc090b44c8b8da59301afb4d"}
Apr 16 19:54:21.346588 ip-10-0-131-77
kubenswrapper[2560]: I0416 19:54:21.346506 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4ppnq" event={"ID":"f46654e0-89ad-48e3-ae92-6dec0b5e5d80","Type":"ContainerStarted","Data":"eae96275594c71cd2fcb20b705e627378cdf2cdfec271809d9ac5733ff132917"} Apr 16 19:54:21.382516 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:21.382476 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4ppnq" podStartSLOduration=1.38246377 podStartE2EDuration="1.38246377s" podCreationTimestamp="2026-04-16 19:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:21.382226419 +0000 UTC m=+27.715607740" watchObservedRunningTime="2026-04-16 19:54:21.38246377 +0000 UTC m=+27.715845091" Apr 16 19:54:22.219472 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:22.219444 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:22.219578 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:22.219444 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:22.219641 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:22.219569 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:22.219695 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:22.219663 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:22.219695 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:22.219458 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:22.219796 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:22.219767 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:24.220399 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:24.220177 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:24.220943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:24.220195 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:24.220943 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:24.220226 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:24.220943 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:24.220564 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:24.220943 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:24.220643 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:24.220943 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:24.220733 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:26.218639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.218554 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:26.219258 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.218554 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:26.219258 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.218694 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5gp27" podUID="243af810-99f2-40cb-b920-2355426fbf4e" Apr 16 19:54:26.219258 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.218554 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:26.219258 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.218786 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnh4p" podUID="e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0" Apr 16 19:54:26.219258 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.218852 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vxhp2" podUID="e5be099e-d9c4-4a29-af14-f803d80a9636" Apr 16 19:54:26.466575 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.466544 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-77.ec2.internal" event="NodeReady" Apr 16 19:54:26.466728 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.466693 2560 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:26.507149 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.507064 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"] Apr 16 19:54:26.542596 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.542555 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kbg5k"] Apr 16 19:54:26.542772 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.542641 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.545588 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.545564 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:54:26.545768 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.545590 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfjh6\"" Apr 16 19:54:26.545869 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.545849 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:54:26.546100 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.546083 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:54:26.550335 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:54:26.550072 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:54:26.557616 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.557594 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t7r5g"] Apr 16 19:54:26.557724 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.557640 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.561809 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.561791 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:26.561893 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.561827 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:26.562061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.562047 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\"" Apr 16 19:54:26.580501 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.580476 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"] Apr 16 19:54:26.580621 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.580508 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.580621 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.580516 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kbg5k"] Apr 16 19:54:26.580732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.580642 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7r5g"] Apr 16 19:54:26.584031 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.584009 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:26.584186 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.584164 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:26.584301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.584164 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\"" Apr 16 19:54:26.584301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.584269 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:26.643345 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643313 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a8b516-10dc-45a7-9ab7-f91fcd27a842-config-volume\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.643509 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643355 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28a8b516-10dc-45a7-9ab7-f91fcd27a842-tmp-dir\") pod 
\"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.643509 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643379 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqsg\" (UniqueName: \"kubernetes.io/projected/28a8b516-10dc-45a7-9ab7-f91fcd27a842-kube-api-access-fmqsg\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.643509 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643417 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.643509 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643478 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643530 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643581 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643629 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643654 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643716 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643758 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvklv\" (UniqueName: 
\"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.643845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.643793 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.744700 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744656 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a8b516-10dc-45a7-9ab7-f91fcd27a842-config-volume\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.744872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744718 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28a8b516-10dc-45a7-9ab7-f91fcd27a842-tmp-dir\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.744872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744856 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqsg\" (UniqueName: \"kubernetes.io/projected/28a8b516-10dc-45a7-9ab7-f91fcd27a842-kube-api-access-fmqsg\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.744988 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744913 2560 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.744988 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744952 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.744988 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.744982 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745266 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745243 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28a8b516-10dc-45a7-9ab7-f91fcd27a842-tmp-dir\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745348 
2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.745485 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.745501 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745526 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745564 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a8b516-10dc-45a7-9ab7-f91fcd27a842-config-volume\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745573 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: 
\"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.745578 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.745594 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.245568709 +0000 UTC m=+33.578950028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745639 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvklv\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.745685 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746808 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.746234 2560 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.246204089 +0000 UTC m=+33.579585396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found Apr 16 19:54:26.746808 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.746264 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746808 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.746390 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.746808 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.746512 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96f76\" (UniqueName: \"kubernetes.io/projected/8b2bac48-99a7-47ac-b46a-269204d0bfe5-kube-api-access-96f76\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.746808 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.746682 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.749479 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.749391 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.750781 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.750732 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.751096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.751073 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.754917 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.754896 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqsg\" (UniqueName: \"kubernetes.io/projected/28a8b516-10dc-45a7-9ab7-f91fcd27a842-kube-api-access-fmqsg\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:26.755406 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:54:26.755384 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.755487 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.755411 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvklv\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:26.847983 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.847890 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96f76\" (UniqueName: \"kubernetes.io/projected/8b2bac48-99a7-47ac-b46a-269204d0bfe5-kube-api-access-96f76\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.847983 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.847948 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.848234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.848010 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " 
pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:26.848234 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.848125 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:26.848234 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.848178 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:26.848234 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.848216 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.348200129 +0000 UTC m=+33.681581444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found Apr 16 19:54:26.848455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.848239 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs podName:e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:58.848223877 +0000 UTC m=+65.181605178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs") pod "network-metrics-daemon-nnh4p" (UID: "e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:26.859213 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.859188 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96f76\" (UniqueName: \"kubernetes.io/projected/8b2bac48-99a7-47ac-b46a-269204d0bfe5-kube-api-access-96f76\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:26.948721 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:26.948674 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:26.948893 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.948857 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:26.948893 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.948883 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:26.949014 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.948898 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7lx2j for pod openshift-network-diagnostics/network-check-target-vxhp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:26.949014 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:26.948958 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j podName:e5be099e-d9c4-4a29-af14-f803d80a9636 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:58.948938744 +0000 UTC m=+65.282320051 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lx2j" (UniqueName: "kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j") pod "network-check-target-vxhp2" (UID: "e5be099e-d9c4-4a29-af14-f803d80a9636") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:27.251995 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:27.251925 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:27.251995 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:27.251971 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:27.252466 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.252057 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:27.252466 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.252068 2560 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found Apr 16 19:54:27.252466 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.252129 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.252096277 +0000 UTC m=+34.585477576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found Apr 16 19:54:27.252466 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.252169 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:27.252466 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.252219 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.252207962 +0000 UTC m=+34.585589261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found Apr 16 19:54:27.352639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:27.352612 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:27.352772 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.352747 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:27.352811 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:27.352804 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.352790042 +0000 UTC m=+34.686171341 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found Apr 16 19:54:28.219390 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.219304 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:28.219533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.219307 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:54:28.219533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.219307 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p" Apr 16 19:54:28.223522 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223494 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.223644 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223534 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-25nwv\"" Apr 16 19:54:28.223644 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223573 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.223644 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223573 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\"" Apr 16 19:54:28.223644 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223619 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:54:28.223644 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.223576 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:28.261135 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.261096 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:28.261135 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.261142 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:28.261455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.261250 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:28.261455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.261302 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.26128777 +0000 UTC m=+36.594669072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found Apr 16 19:54:28.261455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.261253 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:28.261455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.261341 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found Apr 16 19:54:28.261455 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.261396 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" 
failed. No retries permitted until 2026-04-16 19:54:30.261384117 +0000 UTC m=+36.594765421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found Apr 16 19:54:28.360643 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.360613 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="31d2dc7f11636cafefd62c2ac9512af99571afafac0cbaa9c23b540d0b58aed6" exitCode=0 Apr 16 19:54:28.360817 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.360676 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"31d2dc7f11636cafefd62c2ac9512af99571afafac0cbaa9c23b540d0b58aed6"} Apr 16 19:54:28.362417 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.362401 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:28.362574 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.362556 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:28.362636 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:28.362626 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.362605769 +0000 UTC m=+36.695987086 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found Apr 16 19:54:28.463039 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.463008 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:28.465096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.465075 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/243af810-99f2-40cb-b920-2355426fbf4e-original-pull-secret\") pod \"global-pull-secret-syncer-5gp27\" (UID: \"243af810-99f2-40cb-b920-2355426fbf4e\") " pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:28.528786 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.528735 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5gp27" Apr 16 19:54:28.699745 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:28.699716 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5gp27"] Apr 16 19:54:28.711764 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:28.711732 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243af810_99f2_40cb_b920_2355426fbf4e.slice/crio-ae09083e34ae3060bce6be3fe76874033e6440460f9878d6ac88c8f98b2461e5 WatchSource:0}: Error finding container ae09083e34ae3060bce6be3fe76874033e6440460f9878d6ac88c8f98b2461e5: Status 404 returned error can't find the container with id ae09083e34ae3060bce6be3fe76874033e6440460f9878d6ac88c8f98b2461e5 Apr 16 19:54:29.365614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:29.365536 2560 generic.go:358] "Generic (PLEG): container finished" podID="2535cb5c-f95c-424b-a266-b74f5c7f4b0b" containerID="70df87ce8f565a82b516c672deabad4d723228f841f37b081562ed85937d4f68" exitCode=0 Apr 16 19:54:29.366016 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:29.365619 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerDied","Data":"70df87ce8f565a82b516c672deabad4d723228f841f37b081562ed85937d4f68"} Apr 16 19:54:29.366792 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:29.366769 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5gp27" event={"ID":"243af810-99f2-40cb-b920-2355426fbf4e","Type":"ContainerStarted","Data":"ae09083e34ae3060bce6be3fe76874033e6440460f9878d6ac88c8f98b2461e5"} Apr 16 19:54:30.275854 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:30.275774 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:30.275854 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:30.275828 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:30.276070 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.275926 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:30.276070 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.275937 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:30.276070 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.275948 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found Apr 16 19:54:30.276070 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.275992 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:34.275974833 +0000 UTC m=+40.609356145 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found Apr 16 19:54:30.276070 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.276009 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:34.276000495 +0000 UTC m=+40.609381794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found Apr 16 19:54:30.373361 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:30.373323 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s45wn" event={"ID":"2535cb5c-f95c-424b-a266-b74f5c7f4b0b","Type":"ContainerStarted","Data":"8ae63da2a45382387124ce259647c0d3b239cbb08b9c2460090b4b60cf7f20c0"} Apr 16 19:54:30.376939 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:30.376918 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:30.377085 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.377026 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:30.377085 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:30.377082 2560 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:34.377064438 +0000 UTC m=+40.710445740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found Apr 16 19:54:30.404992 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:30.404947 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s45wn" podStartSLOduration=4.219604399 podStartE2EDuration="36.40493328s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:53:55.441007214 +0000 UTC m=+1.774388513" lastFinishedPulling="2026-04-16 19:54:27.626336092 +0000 UTC m=+33.959717394" observedRunningTime="2026-04-16 19:54:30.403425771 +0000 UTC m=+36.736807104" watchObservedRunningTime="2026-04-16 19:54:30.40493328 +0000 UTC m=+36.738314600" Apr 16 19:54:33.380919 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:33.380820 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5gp27" event={"ID":"243af810-99f2-40cb-b920-2355426fbf4e","Type":"ContainerStarted","Data":"2d0dedd4190da6717328f477efc1d4fcc72ea514014cb4443a399e67ccb23534"} Apr 16 19:54:33.400972 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:33.400923 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5gp27" podStartSLOduration=33.194586491 podStartE2EDuration="37.400908599s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:54:28.713351252 +0000 UTC m=+35.046732551" lastFinishedPulling="2026-04-16 19:54:32.919673357 +0000 UTC m=+39.253054659" observedRunningTime="2026-04-16 
19:54:33.398755485 +0000 UTC m=+39.732136806" watchObservedRunningTime="2026-04-16 19:54:33.400908599 +0000 UTC m=+39.734289919" Apr 16 19:54:34.306570 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:34.306527 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k" Apr 16 19:54:34.306764 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:34.306576 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:54:34.306764 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.306686 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:34.306764 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.306702 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found Apr 16 19:54:34.306764 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.306755 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:42.306737882 +0000 UTC m=+48.640119194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found Apr 16 19:54:34.306987 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.306687 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:34.306987 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.306835 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:42.306817478 +0000 UTC m=+48.640198783 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found Apr 16 19:54:34.407416 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:34.407384 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g" Apr 16 19:54:34.407776 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.407496 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:34.407776 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:34.407547 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:42.407531309 +0000 UTC m=+48.740912625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found
Apr 16 19:54:42.362464 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:42.362419 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:54:42.362464 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:42.362466 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm"
Apr 16 19:54:42.362967 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.362563 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:42.362967 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.362620 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls podName:28a8b516-10dc-45a7-9ab7-f91fcd27a842 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:58.362604553 +0000 UTC m=+64.695985851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls") pod "dns-default-kbg5k" (UID: "28a8b516-10dc-45a7-9ab7-f91fcd27a842") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:42.362967 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.362673 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:42.362967 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.362689 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67cbbd8898-vr2dm: secret "image-registry-tls" not found
Apr 16 19:54:42.362967 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.362743 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls podName:01daca45-c3aa-4353-91ac-68cb75bf0890 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:58.362727719 +0000 UTC m=+64.696109022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls") pod "image-registry-67cbbd8898-vr2dm" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890") : secret "image-registry-tls" not found
Apr 16 19:54:42.463573 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:42.463536 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g"
Apr 16 19:54:42.463744 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.463651 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:42.463744 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:54:42.463719 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert podName:8b2bac48-99a7-47ac-b46a-269204d0bfe5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:58.463698019 +0000 UTC m=+64.797079322 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert") pod "ingress-canary-t7r5g" (UID: "8b2bac48-99a7-47ac-b46a-269204d0bfe5") : secret "canary-serving-cert" not found
Apr 16 19:54:47.713043 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:47.713015 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4ppnq_f46654e0-89ad-48e3-ae92-6dec0b5e5d80/dns-node-resolver/0.log"
Apr 16 19:54:49.115447 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:49.115420 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w48w2_038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8/node-ca/0.log"
Apr 16 19:54:52.358421 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:52.358391 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29x9d"
Apr 16 19:54:58.374672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.374627 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:54:58.374672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.374674 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm"
Apr 16 19:54:58.378158 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.378138 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28a8b516-10dc-45a7-9ab7-f91fcd27a842-metrics-tls\") pod \"dns-default-kbg5k\" (UID: \"28a8b516-10dc-45a7-9ab7-f91fcd27a842\") " pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:54:58.388378 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.388349 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"image-registry-67cbbd8898-vr2dm\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm"
Apr 16 19:54:58.475845 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.475810 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g"
Apr 16 19:54:58.478163 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.478140 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b2bac48-99a7-47ac-b46a-269204d0bfe5-cert\") pod \"ingress-canary-t7r5g\" (UID: \"8b2bac48-99a7-47ac-b46a-269204d0bfe5\") " pod="openshift-ingress-canary/ingress-canary-t7r5g"
Apr 16 19:54:58.657442 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.657370 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfjh6\""
Apr 16 19:54:58.665881 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.665865 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm"
Apr 16 19:54:58.670125 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.670094 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\""
Apr 16 19:54:58.677630 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.677609 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:54:58.692923 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.692901 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\""
Apr 16 19:54:58.701426 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.701386 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7r5g"
Apr 16 19:54:58.802551 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.802507 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"]
Apr 16 19:54:58.806989 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:58.806957 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01daca45_c3aa_4353_91ac_68cb75bf0890.slice/crio-4d8f60ced0d8b67670b73bcb33b8067356fce97f055fafc49ce6598cc8cbabb2 WatchSource:0}: Error finding container 4d8f60ced0d8b67670b73bcb33b8067356fce97f055fafc49ce6598cc8cbabb2: Status 404 returned error can't find the container with id 4d8f60ced0d8b67670b73bcb33b8067356fce97f055fafc49ce6598cc8cbabb2
Apr 16 19:54:58.815836 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.815809 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kbg5k"]
Apr 16 19:54:58.818272 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:58.818249 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a8b516_10dc_45a7_9ab7_f91fcd27a842.slice/crio-2aa8f6120d62243af0fb65178d004d59df8325b93a62f2473e4f7a6572d3397b WatchSource:0}: Error finding container 2aa8f6120d62243af0fb65178d004d59df8325b93a62f2473e4f7a6572d3397b: Status 404 returned error can't find the container with id 2aa8f6120d62243af0fb65178d004d59df8325b93a62f2473e4f7a6572d3397b
Apr 16 19:54:58.840910 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.840893 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7r5g"]
Apr 16 19:54:58.842868 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:58.842841 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2bac48_99a7_47ac_b46a_269204d0bfe5.slice/crio-d770a848d5ce7e96f6b6fa05ff0a0e4a758fca7d40ef13cd84996d30b8fcf688 WatchSource:0}: Error finding container d770a848d5ce7e96f6b6fa05ff0a0e4a758fca7d40ef13cd84996d30b8fcf688: Status 404 returned error can't find the container with id d770a848d5ce7e96f6b6fa05ff0a0e4a758fca7d40ef13cd84996d30b8fcf688
Apr 16 19:54:58.879026 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.879001 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:58.883829 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.883804 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:54:58.891158 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.891137 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0-metrics-certs\") pod \"network-metrics-daemon-nnh4p\" (UID: \"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0\") " pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:58.980067 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.980041 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:58.992731 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:58.992705 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:54:59.003639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.003615 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:54:59.014061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.014038 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lx2j\" (UniqueName: \"kubernetes.io/projected/e5be099e-d9c4-4a29-af14-f803d80a9636-kube-api-access-7lx2j\") pod \"network-check-target-vxhp2\" (UID: \"e5be099e-d9c4-4a29-af14-f803d80a9636\") " pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:59.137147 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.137099 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-25nwv\""
Apr 16 19:54:59.141754 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.141739 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\""
Apr 16 19:54:59.145246 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.145231 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:54:59.149850 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.149827 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnh4p"
Apr 16 19:54:59.275571 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.275538 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vxhp2"]
Apr 16 19:54:59.286060 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:59.286030 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5be099e_d9c4_4a29_af14_f803d80a9636.slice/crio-f580a205e6a6d79fe8a1e42bbc44ce362de60ae6f2b684fb8a61485e681386d1 WatchSource:0}: Error finding container f580a205e6a6d79fe8a1e42bbc44ce362de60ae6f2b684fb8a61485e681386d1: Status 404 returned error can't find the container with id f580a205e6a6d79fe8a1e42bbc44ce362de60ae6f2b684fb8a61485e681386d1
Apr 16 19:54:59.289841 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.289818 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nnh4p"]
Apr 16 19:54:59.292955 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:54:59.292928 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6dd25f1_c4b9_4e8a_9a38_70bb41d1ded0.slice/crio-1ee746c5de15cf6779d3c19d5d08163eed53a9acf0c14b13ad83cbace60d9dab WatchSource:0}: Error finding container 1ee746c5de15cf6779d3c19d5d08163eed53a9acf0c14b13ad83cbace60d9dab: Status 404 returned error can't find the container with id 1ee746c5de15cf6779d3c19d5d08163eed53a9acf0c14b13ad83cbace60d9dab
Apr 16 19:54:59.434091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.434033 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vxhp2" event={"ID":"e5be099e-d9c4-4a29-af14-f803d80a9636","Type":"ContainerStarted","Data":"f580a205e6a6d79fe8a1e42bbc44ce362de60ae6f2b684fb8a61485e681386d1"}
Apr 16 19:54:59.435669 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.435640 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7r5g" event={"ID":"8b2bac48-99a7-47ac-b46a-269204d0bfe5","Type":"ContainerStarted","Data":"d770a848d5ce7e96f6b6fa05ff0a0e4a758fca7d40ef13cd84996d30b8fcf688"}
Apr 16 19:54:59.436838 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.436808 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kbg5k" event={"ID":"28a8b516-10dc-45a7-9ab7-f91fcd27a842","Type":"ContainerStarted","Data":"2aa8f6120d62243af0fb65178d004d59df8325b93a62f2473e4f7a6572d3397b"}
Apr 16 19:54:59.438390 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.438367 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" event={"ID":"01daca45-c3aa-4353-91ac-68cb75bf0890","Type":"ContainerStarted","Data":"1299cb067e664737973d957edc44bdada5195b541906915aee4ac90d7682b09e"}
Apr 16 19:54:59.438504 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.438397 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" event={"ID":"01daca45-c3aa-4353-91ac-68cb75bf0890","Type":"ContainerStarted","Data":"4d8f60ced0d8b67670b73bcb33b8067356fce97f055fafc49ce6598cc8cbabb2"}
Apr 16 19:54:59.438504 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.438460 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm"
Apr 16 19:54:59.439476 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.439447 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnh4p" event={"ID":"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0","Type":"ContainerStarted","Data":"1ee746c5de15cf6779d3c19d5d08163eed53a9acf0c14b13ad83cbace60d9dab"}
Apr 16 19:54:59.461180 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:54:59.460058 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" podStartSLOduration=60.460040931 podStartE2EDuration="1m0.460040931s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:59.459684047 +0000 UTC m=+65.793065367" watchObservedRunningTime="2026-04-16 19:54:59.460040931 +0000 UTC m=+65.793422254"
Apr 16 19:55:02.457059 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:02.457026 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7r5g" event={"ID":"8b2bac48-99a7-47ac-b46a-269204d0bfe5","Type":"ContainerStarted","Data":"ebf6032e19d43b51ff2ba9cc3d6013d8fa84cf258d4754bfee308bf73511a9a6"}
Apr 16 19:55:02.475126 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:02.475068 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t7r5g" podStartSLOduration=33.004657444 podStartE2EDuration="36.475049304s" podCreationTimestamp="2026-04-16 19:54:26 +0000 UTC" firstStartedPulling="2026-04-16 19:54:58.84448459 +0000 UTC m=+65.177865889" lastFinishedPulling="2026-04-16 19:55:02.314876444 +0000 UTC m=+68.648257749" observedRunningTime="2026-04-16 19:55:02.474954927 +0000 UTC m=+68.808336250" watchObservedRunningTime="2026-04-16 19:55:02.475049304 +0000 UTC m=+68.808430626"
Apr 16 19:55:03.466075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.466041 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnh4p" event={"ID":"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0","Type":"ContainerStarted","Data":"04eea03653412d82743d50fbdb7efce05a7ec2118257c797d27389805ce77c58"}
Apr 16 19:55:03.466075 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.466079 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnh4p" event={"ID":"e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0","Type":"ContainerStarted","Data":"2e1b114879a2794f8d84075ae2c0190b485b18a22a6666a9394eb7cf4de8d025"}
Apr 16 19:55:03.467379 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.467355 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vxhp2" event={"ID":"e5be099e-d9c4-4a29-af14-f803d80a9636","Type":"ContainerStarted","Data":"9d51b4dca7c559b2e681b01a7a18b756061640ccc6919bf441eaf3df74da1b93"}
Apr 16 19:55:03.467499 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.467483 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vxhp2"
Apr 16 19:55:03.468869 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.468846 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kbg5k" event={"ID":"28a8b516-10dc-45a7-9ab7-f91fcd27a842","Type":"ContainerStarted","Data":"d05be107b54abf239be337d49e511b7e0d8527f8ce3c989e43a648ca1affebec"}
Apr 16 19:55:03.468953 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.468875 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kbg5k" event={"ID":"28a8b516-10dc-45a7-9ab7-f91fcd27a842","Type":"ContainerStarted","Data":"79aefa1b76edfc3ccdf4b11936c52145af25da92235e984895a7b85283dd900d"}
Apr 16 19:55:03.469001 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.468984 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:55:03.486203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.486163 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nnh4p" podStartSLOduration=66.471488199 podStartE2EDuration="1m9.486152225s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:59.294797475 +0000 UTC m=+65.628178777" lastFinishedPulling="2026-04-16 19:55:02.3094615 +0000 UTC m=+68.642842803" observedRunningTime="2026-04-16 19:55:03.484962231 +0000 UTC m=+69.818343552" watchObservedRunningTime="2026-04-16 19:55:03.486152225 +0000 UTC m=+69.819533570"
Apr 16 19:55:03.507056 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:03.507016 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vxhp2" podStartSLOduration=66.425914951 podStartE2EDuration="1m9.507005042s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:59.288102785 +0000 UTC m=+65.621484084" lastFinishedPulling="2026-04-16 19:55:02.369192859 +0000 UTC m=+68.702574175" observedRunningTime="2026-04-16 19:55:03.506669864 +0000 UTC m=+69.840051185" watchObservedRunningTime="2026-04-16 19:55:03.507005042 +0000 UTC m=+69.840386362"
Apr 16 19:55:08.405288 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.405164 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kbg5k" podStartSLOduration=38.917129488 podStartE2EDuration="42.405147877s" podCreationTimestamp="2026-04-16 19:54:26 +0000 UTC" firstStartedPulling="2026-04-16 19:54:58.8201612 +0000 UTC m=+65.153542500" lastFinishedPulling="2026-04-16 19:55:02.30817959 +0000 UTC m=+68.641560889" observedRunningTime="2026-04-16 19:55:03.530209597 +0000 UTC m=+69.863590918" watchObservedRunningTime="2026-04-16 19:55:08.405147877 +0000 UTC m=+74.738529198"
Apr 16 19:55:08.405838 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.405790 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"]
Apr 16 19:55:08.473014 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.472982 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qxbw4"]
Apr 16 19:55:08.477972 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.477955 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.480837 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.480811 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 19:55:08.481211 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.481192 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:55:08.481320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.481221 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vh9v5\""
Apr 16 19:55:08.481380 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.481347 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:55:08.481427 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.481373 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 19:55:08.494917 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.494890 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qxbw4"]
Apr 16 19:55:08.541231 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.541201 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c4c920-cd1c-4a00-acef-32d4f5377828-data-volume\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.541362 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.541238 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c4c920-cd1c-4a00-acef-32d4f5377828-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.541362 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.541263 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c4c920-cd1c-4a00-acef-32d4f5377828-crio-socket\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.541435 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.541355 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.541435 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.541397 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbtn\" (UniqueName: \"kubernetes.io/projected/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-api-access-thbtn\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642647 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642616 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c4c920-cd1c-4a00-acef-32d4f5377828-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642792 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642657 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c4c920-cd1c-4a00-acef-32d4f5377828-crio-socket\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642792 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642740 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74c4c920-cd1c-4a00-acef-32d4f5377828-crio-socket\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642792 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642776 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642921 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642908 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thbtn\" (UniqueName: \"kubernetes.io/projected/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-api-access-thbtn\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.642956 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.642941 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c4c920-cd1c-4a00-acef-32d4f5377828-data-volume\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.643222 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.643205 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74c4c920-cd1c-4a00-acef-32d4f5377828-data-volume\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.643861 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.643839 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.644962 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.644943 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74c4c920-cd1c-4a00-acef-32d4f5377828-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.654088 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.654067 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbtn\" (UniqueName: \"kubernetes.io/projected/74c4c920-cd1c-4a00-acef-32d4f5377828-kube-api-access-thbtn\") pod \"insights-runtime-extractor-qxbw4\" (UID: \"74c4c920-cd1c-4a00-acef-32d4f5377828\") " pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.787404 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.787371 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qxbw4"
Apr 16 19:55:08.901442 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:08.901417 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qxbw4"]
Apr 16 19:55:08.903592 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:08.903569 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c4c920_cd1c_4a00_acef_32d4f5377828.slice/crio-1bae612cedb990ad9afd1713fab865d0fbc4a1a6a48b174fea4dfd8caf2b426d WatchSource:0}: Error finding container 1bae612cedb990ad9afd1713fab865d0fbc4a1a6a48b174fea4dfd8caf2b426d: Status 404 returned error can't find the container with id 1bae612cedb990ad9afd1713fab865d0fbc4a1a6a48b174fea4dfd8caf2b426d
Apr 16 19:55:09.485369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:09.485332 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qxbw4" event={"ID":"74c4c920-cd1c-4a00-acef-32d4f5377828","Type":"ContainerStarted","Data":"b30170934207c4ced9fbe77567039a3bc6b0698f1f89e82f60b0b7918e89b524"}
Apr 16 19:55:09.485369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:09.485373 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qxbw4" event={"ID":"74c4c920-cd1c-4a00-acef-32d4f5377828","Type":"ContainerStarted","Data":"1bae612cedb990ad9afd1713fab865d0fbc4a1a6a48b174fea4dfd8caf2b426d"}
Apr 16 19:55:10.489282 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:10.489247 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qxbw4" event={"ID":"74c4c920-cd1c-4a00-acef-32d4f5377828","Type":"ContainerStarted","Data":"56f5f82b0efbf826f18c35617591f36011e49af876078ee1c5dd594fe0cad08d"}
Apr 16 19:55:11.493584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:11.493547 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qxbw4" event={"ID":"74c4c920-cd1c-4a00-acef-32d4f5377828","Type":"ContainerStarted","Data":"7a8b8525745493eb83a7e782549f4b60496b4dedcd83579833c5d0cc47cf18bd"}
Apr 16 19:55:11.514738 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:11.514691 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qxbw4" podStartSLOduration=1.557056258 podStartE2EDuration="3.514678694s" podCreationTimestamp="2026-04-16 19:55:08 +0000 UTC" firstStartedPulling="2026-04-16 19:55:08.958798006 +0000 UTC m=+75.292179305" lastFinishedPulling="2026-04-16 19:55:10.91642044 +0000 UTC m=+77.249801741" observedRunningTime="2026-04-16 19:55:11.513033964 +0000 UTC m=+77.846415286" watchObservedRunningTime="2026-04-16 19:55:11.514678694 +0000 UTC m=+77.848060015"
Apr 16 19:55:13.473751 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:13.473645 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kbg5k"
Apr 16 19:55:17.170588 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.170550 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-5gz5z"]
Apr 16 19:55:17.174884 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.174861 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5gz5z"
Apr 16 19:55:17.177598 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.177578 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 19:55:17.177719 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.177696 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4ns9h\""
Apr 16 19:55:17.177766 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.177732 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 19:55:17.194425 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.194405 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5gz5z"]
Apr 16 19:55:17.299156 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.299097 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqfw\" (UniqueName: \"kubernetes.io/projected/2fcbb9c9-45d1-42ca-ad88-df53ea6940b7-kube-api-access-mvqfw\") pod \"downloads-6bcc868b7-5gz5z\" (UID: \"2fcbb9c9-45d1-42ca-ad88-df53ea6940b7\") " pod="openshift-console/downloads-6bcc868b7-5gz5z"
Apr 16 19:55:17.400359 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.400315 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqfw\" (UniqueName: \"kubernetes.io/projected/2fcbb9c9-45d1-42ca-ad88-df53ea6940b7-kube-api-access-mvqfw\") pod \"downloads-6bcc868b7-5gz5z\" (UID: \"2fcbb9c9-45d1-42ca-ad88-df53ea6940b7\") " pod="openshift-console/downloads-6bcc868b7-5gz5z"
Apr 16 19:55:17.408438 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.408417 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqfw\" (UniqueName: \"kubernetes.io/projected/2fcbb9c9-45d1-42ca-ad88-df53ea6940b7-kube-api-access-mvqfw\") pod \"downloads-6bcc868b7-5gz5z\" (UID: \"2fcbb9c9-45d1-42ca-ad88-df53ea6940b7\") " pod="openshift-console/downloads-6bcc868b7-5gz5z"
Apr 16 19:55:17.483393 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.483321 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5gz5z"
Apr 16 19:55:17.595978 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:17.595939 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5gz5z"]
Apr 16 19:55:17.599631 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:17.599605 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fcbb9c9_45d1_42ca_ad88_df53ea6940b7.slice/crio-1e24b0bed1dab6d0eaaa94793753c57b96607ff5f45ba19c6873c85091b10cd8 WatchSource:0}: Error finding container 1e24b0bed1dab6d0eaaa94793753c57b96607ff5f45ba19c6873c85091b10cd8: Status 404 returned error can't find the container with id 1e24b0bed1dab6d0eaaa94793753c57b96607ff5f45ba19c6873c85091b10cd8
Apr 16 19:55:18.412362 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:18.412321 2560 patch_prober.go:28] interesting pod/image-registry-67cbbd8898-vr2dm container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 19:55:18.412778 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:18.412393 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 19:55:18.512551 ip-10-0-131-77
kubenswrapper[2560]: I0416 19:55:18.512517 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5gz5z" event={"ID":"2fcbb9c9-45d1-42ca-ad88-df53ea6940b7","Type":"ContainerStarted","Data":"1e24b0bed1dab6d0eaaa94793753c57b96607ff5f45ba19c6873c85091b10cd8"} Apr 16 19:55:28.411205 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:28.411173 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:55:31.848065 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:31.848029 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd"] Apr 16 19:55:31.852428 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:31.852405 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:31.855097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:31.855070 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 19:55:31.855385 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:31.855366 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bxbwc\"" Apr 16 19:55:31.858254 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:31.858230 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd"] Apr 16 19:55:32.007614 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:32.007572 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-tfrkd\" (UID: \"92ba2761-16cb-4de3-9015-696acbf9c5ae\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:32.108770 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:32.108690 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tfrkd\" (UID: \"92ba2761-16cb-4de3-9015-696acbf9c5ae\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:32.108933 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:32.108833 2560 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 19:55:32.108933 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:32.108918 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates podName:92ba2761-16cb-4de3-9015-696acbf9c5ae nodeName:}" failed. No retries permitted until 2026-04-16 19:55:32.60889404 +0000 UTC m=+98.942275362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-tfrkd" (UID: "92ba2761-16cb-4de3-9015-696acbf9c5ae") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 19:55:32.612856 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:32.612816 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tfrkd\" (UID: \"92ba2761-16cb-4de3-9015-696acbf9c5ae\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:32.615637 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:32.615609 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92ba2761-16cb-4de3-9015-696acbf9c5ae-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-tfrkd\" (UID: \"92ba2761-16cb-4de3-9015-696acbf9c5ae\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:32.763403 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:32.763370 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" Apr 16 19:55:33.428364 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.428335 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd"] Apr 16 19:55:33.429659 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.429597 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerName="registry" containerID="cri-o://1299cb067e664737973d957edc44bdada5195b541906915aee4ac90d7682b09e" gracePeriod=30 Apr 16 19:55:33.431480 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:33.431443 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ba2761_16cb_4de3_9015_696acbf9c5ae.slice/crio-209b0d17c4fb3ddceadf3c415a439982a6c922fbb4ed5d174e7ab58722c66bbf WatchSource:0}: Error finding container 209b0d17c4fb3ddceadf3c415a439982a6c922fbb4ed5d174e7ab58722c66bbf: Status 404 returned error can't find the container with id 209b0d17c4fb3ddceadf3c415a439982a6c922fbb4ed5d174e7ab58722c66bbf Apr 16 19:55:33.550409 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.550369 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5gz5z" event={"ID":"2fcbb9c9-45d1-42ca-ad88-df53ea6940b7","Type":"ContainerStarted","Data":"60eb6c3f79aef34463c6ff2ffceb5e4c3d7f0cf426647dde28c6c4a25111e910"} Apr 16 19:55:33.550560 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.550493 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-5gz5z" Apr 16 19:55:33.552005 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.551974 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" event={"ID":"92ba2761-16cb-4de3-9015-696acbf9c5ae","Type":"ContainerStarted","Data":"209b0d17c4fb3ddceadf3c415a439982a6c922fbb4ed5d174e7ab58722c66bbf"} Apr 16 19:55:33.552344 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.552323 2560 patch_prober.go:28] interesting pod/downloads-6bcc868b7-5gz5z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.10:8080/\": dial tcp 10.133.0.10:8080: connect: connection refused" start-of-body= Apr 16 19:55:33.552448 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.552371 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-5gz5z" podUID="2fcbb9c9-45d1-42ca-ad88-df53ea6940b7" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.10:8080/\": dial tcp 10.133.0.10:8080: connect: connection refused" Apr 16 19:55:33.553454 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.553427 2560 generic.go:358] "Generic (PLEG): container finished" podID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerID="1299cb067e664737973d957edc44bdada5195b541906915aee4ac90d7682b09e" exitCode=0 Apr 16 19:55:33.553636 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.553482 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" event={"ID":"01daca45-c3aa-4353-91ac-68cb75bf0890","Type":"ContainerDied","Data":"1299cb067e664737973d957edc44bdada5195b541906915aee4ac90d7682b09e"} Apr 16 19:55:33.568375 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.568305 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-5gz5z" podStartSLOduration=0.763024051 podStartE2EDuration="16.568292789s" podCreationTimestamp="2026-04-16 19:55:17 +0000 UTC" firstStartedPulling="2026-04-16 19:55:17.60146644 +0000 UTC m=+83.934847740" 
lastFinishedPulling="2026-04-16 19:55:33.406735172 +0000 UTC m=+99.740116478" observedRunningTime="2026-04-16 19:55:33.567196183 +0000 UTC m=+99.900577506" watchObservedRunningTime="2026-04-16 19:55:33.568292789 +0000 UTC m=+99.901674111" Apr 16 19:55:33.663185 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.663161 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:55:33.823489 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823391 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823489 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823461 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvklv\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823698 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823566 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823698 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823609 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 
19:55:33.823698 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823651 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823698 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823680 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823717 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823760 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token\") pod \"01daca45-c3aa-4353-91ac-68cb75bf0890\" (UID: \"01daca45-c3aa-4353-91ac-68cb75bf0890\") " Apr 16 19:55:33.823886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823829 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:33.824038 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.823950 2560 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-certificates\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.824765 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.824727 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:33.826349 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.826266 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:33.826349 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.826307 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:33.826533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.826488 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv" (OuterVolumeSpecName: "kube-api-access-dvklv") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "kube-api-access-dvklv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:33.826785 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.826763 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:33.827867 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.827836 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:33.846002 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.845974 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "01daca45-c3aa-4353-91ac-68cb75bf0890" (UID: "01daca45-c3aa-4353-91ac-68cb75bf0890"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:55:33.925055 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925017 2560 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01daca45-c3aa-4353-91ac-68cb75bf0890-trusted-ca\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925055 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925051 2560 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-installation-pull-secrets\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925066 2560 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-registry-tls\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925084 2560 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/01daca45-c3aa-4353-91ac-68cb75bf0890-ca-trust-extracted\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925100 2560 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/01daca45-c3aa-4353-91ac-68cb75bf0890-image-registry-private-configuration\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925136 2560 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-bound-sa-token\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:33.925301 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:33.925153 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvklv\" (UniqueName: \"kubernetes.io/projected/01daca45-c3aa-4353-91ac-68cb75bf0890-kube-api-access-dvklv\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:55:34.474800 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.474359 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vxhp2" Apr 16 19:55:34.557771 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.557685 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" Apr 16 19:55:34.557771 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.557740 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67cbbd8898-vr2dm" event={"ID":"01daca45-c3aa-4353-91ac-68cb75bf0890","Type":"ContainerDied","Data":"4d8f60ced0d8b67670b73bcb33b8067356fce97f055fafc49ce6598cc8cbabb2"} Apr 16 19:55:34.557993 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.557780 2560 scope.go:117] "RemoveContainer" containerID="1299cb067e664737973d957edc44bdada5195b541906915aee4ac90d7682b09e" Apr 16 19:55:34.571102 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.571079 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-5gz5z" Apr 16 19:55:34.576215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.576181 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"] Apr 16 19:55:34.583212 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:34.583067 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67cbbd8898-vr2dm"] Apr 16 19:55:35.563401 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.563368 2560 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" event={"ID":"92ba2761-16cb-4de3-9015-696acbf9c5ae","Type":"ContainerStarted","Data":"9f8e410bcc3f8ccbf10ae25f40cb3d130283d58f0acbc3370ad4f072322ecaac"} Apr 16 19:55:35.579475 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.579413 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd" podStartSLOduration=2.898239479 podStartE2EDuration="4.579395777s" podCreationTimestamp="2026-04-16 19:55:31 +0000 UTC" firstStartedPulling="2026-04-16 19:55:33.43320732 +0000 UTC m=+99.766588620" lastFinishedPulling="2026-04-16 19:55:35.114363616 +0000 UTC m=+101.447744918" observedRunningTime="2026-04-16 19:55:35.578849117 +0000 UTC m=+101.912230467" watchObservedRunningTime="2026-04-16 19:55:35.579395777 +0000 UTC m=+101.912777098" Apr 16 19:55:35.696953 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.696921 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"] Apr 16 19:55:35.697221 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.697203 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerName="registry" Apr 16 19:55:35.697317 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.697224 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerName="registry" Apr 16 19:55:35.697317 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.697285 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" containerName="registry" Apr 16 19:55:35.719841 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.719805 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"] Apr 16 19:55:35.720005 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:55:35.719929 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.723988 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.723957 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:55:35.724172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.724146 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:55:35.724239 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.724218 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:55:35.724294 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.724241 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:55:35.724339 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.724218 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:55:35.724403 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.724385 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nt2zh\"" Apr 16 19:55:35.837673 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837592 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.837673 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837641 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.837898 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837735 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.837898 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837791 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.837898 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837853 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:55:35.838053 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.837900 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r2jt\" (UniqueName: \"kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 
19:55:35.938639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938600 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r2jt\" (UniqueName: \"kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.938815 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938652 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.938815 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938687 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.938815 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938720 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.939025 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938880 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.939025 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.938946 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.939395 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.939366 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.939527 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.939438 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.939624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.939586 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.941498 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.941474 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.941586 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.941518 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:35.958127 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:35.958096 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r2jt\" (UniqueName: \"kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt\") pod \"console-689b7596d9-x4xwj\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:36.033771 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.033730 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:36.171297 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.171263 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"]
Apr 16 19:55:36.174655 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:36.174628 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b712db_666a_4fc6_8855_ad59a92b9365.slice/crio-1c4e099f327b8d2ca2025817722bfd5e1e45429ddb84e84a89a07cc174a3290b WatchSource:0}: Error finding container 1c4e099f327b8d2ca2025817722bfd5e1e45429ddb84e84a89a07cc174a3290b: Status 404 returned error can't find the container with id 1c4e099f327b8d2ca2025817722bfd5e1e45429ddb84e84a89a07cc174a3290b
Apr 16 19:55:36.222606 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.222568 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01daca45-c3aa-4353-91ac-68cb75bf0890" path="/var/lib/kubelet/pods/01daca45-c3aa-4353-91ac-68cb75bf0890/volumes"
Apr 16 19:55:36.567370 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.567332 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b7596d9-x4xwj" event={"ID":"95b712db-666a-4fc6-8855-ad59a92b9365","Type":"ContainerStarted","Data":"1c4e099f327b8d2ca2025817722bfd5e1e45429ddb84e84a89a07cc174a3290b"}
Apr 16 19:55:36.567841 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.567632 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd"
Apr 16 19:55:36.572963 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.572939 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-tfrkd"
Apr 16 19:55:36.931840 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.931721 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7jvps"]
Apr 16 19:55:36.952703 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.952674 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7jvps"]
Apr 16 19:55:36.952957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.952938 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:36.956427 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.956329 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-ccpz7\""
Apr 16 19:55:36.956427 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.956392 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 19:55:36.956688 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.956632 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:55:36.958340 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.957849 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:55:36.958340 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.957988 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 19:55:36.958340 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:36.958148 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:55:37.047933 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.047898 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.048199 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.047952 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11a83071-15af-481e-866a-601e7919b2a3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.048199 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.047994 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chh6\" (UniqueName: \"kubernetes.io/projected/11a83071-15af-481e-866a-601e7919b2a3-kube-api-access-2chh6\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.048381 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.048221 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.149514 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.149435 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.149792 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.149759 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.150842 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.149804 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11a83071-15af-481e-866a-601e7919b2a3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.150842 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.149840 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2chh6\" (UniqueName: \"kubernetes.io/projected/11a83071-15af-481e-866a-601e7919b2a3-kube-api-access-2chh6\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.150842 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:37.150188 2560 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 19:55:37.150842 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:37.150435 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls podName:11a83071-15af-481e-866a-601e7919b2a3 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:37.650392393 +0000 UTC m=+103.983773698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-7jvps" (UID: "11a83071-15af-481e-866a-601e7919b2a3") : secret "prometheus-operator-tls" not found
Apr 16 19:55:37.151641 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.151408 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11a83071-15af-481e-866a-601e7919b2a3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.152829 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.152803 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.166490 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.166463 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chh6\" (UniqueName: \"kubernetes.io/projected/11a83071-15af-481e-866a-601e7919b2a3-kube-api-access-2chh6\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.655227 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.655190 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.658476 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.658414 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11a83071-15af-481e-866a-601e7919b2a3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7jvps\" (UID: \"11a83071-15af-481e-866a-601e7919b2a3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:37.866001 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:37.865928 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps"
Apr 16 19:55:38.007149 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:38.007073 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7jvps"]
Apr 16 19:55:38.011060 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:38.011029 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a83071_15af_481e_866a_601e7919b2a3.slice/crio-ed63e5b9286d5449f779995ed97280ff60df2df19a299a0595b780076200dd0b WatchSource:0}: Error finding container ed63e5b9286d5449f779995ed97280ff60df2df19a299a0595b780076200dd0b: Status 404 returned error can't find the container with id ed63e5b9286d5449f779995ed97280ff60df2df19a299a0595b780076200dd0b
Apr 16 19:55:38.575929 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:38.575885 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps" event={"ID":"11a83071-15af-481e-866a-601e7919b2a3","Type":"ContainerStarted","Data":"ed63e5b9286d5449f779995ed97280ff60df2df19a299a0595b780076200dd0b"}
Apr 16 19:55:40.583994 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:40.583895 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b7596d9-x4xwj" event={"ID":"95b712db-666a-4fc6-8855-ad59a92b9365","Type":"ContainerStarted","Data":"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6"}
Apr 16 19:55:40.604089 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:40.603962 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-689b7596d9-x4xwj" podStartSLOduration=1.9687718520000002 podStartE2EDuration="5.603944344s" podCreationTimestamp="2026-04-16 19:55:35 +0000 UTC" firstStartedPulling="2026-04-16 19:55:36.176908677 +0000 UTC m=+102.510289980" lastFinishedPulling="2026-04-16 19:55:39.812081173 +0000 UTC m=+106.145462472" observedRunningTime="2026-04-16 19:55:40.602995431 +0000 UTC m=+106.936376764" watchObservedRunningTime="2026-04-16 19:55:40.603944344 +0000 UTC m=+106.937325666"
Apr 16 19:55:41.588847 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:41.588805 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps" event={"ID":"11a83071-15af-481e-866a-601e7919b2a3","Type":"ContainerStarted","Data":"b25a41ada766ba77757d0900f170169f0bdb497f0500386c33f3db454d920795"}
Apr 16 19:55:41.588847 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:41.588851 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps" event={"ID":"11a83071-15af-481e-866a-601e7919b2a3","Type":"ContainerStarted","Data":"244fcc0ef89d96778bfbe66db962d5d3ff75a03a08fe66cb9a8a01497525512b"}
Apr 16 19:55:41.606338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:41.606287 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-7jvps" podStartSLOduration=2.571471693 podStartE2EDuration="5.606269246s" podCreationTimestamp="2026-04-16 19:55:36 +0000 UTC" firstStartedPulling="2026-04-16 19:55:38.013588796 +0000 UTC m=+104.346970116" lastFinishedPulling="2026-04-16 19:55:41.048386367 +0000 UTC m=+107.381767669" observedRunningTime="2026-04-16 19:55:41.604525863 +0000 UTC m=+107.937907185" watchObservedRunningTime="2026-04-16 19:55:41.606269246 +0000 UTC m=+107.939650569"
Apr 16 19:55:43.265032 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.264998 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"]
Apr 16 19:55:43.281442 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.281409 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"]
Apr 16 19:55:43.281734 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.281702 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.284319 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.284258 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8t6bh\""
Apr 16 19:55:43.284429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.284403 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 19:55:43.285065 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.285025 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 19:55:43.302512 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.302489 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"]
Apr 16 19:55:43.302512 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.302516 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"]
Apr 16 19:55:43.302665 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.302615 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.309512 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.309488 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 19:55:43.311043 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.311013 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ff6mc"]
Apr 16 19:55:43.312787 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.312771 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 19:55:43.313008 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.312993 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qz6nq\""
Apr 16 19:55:43.313265 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.313251 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 19:55:43.332456 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.332432 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.341752 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.341733 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:55:43.341959 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.341945 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wxw92\""
Apr 16 19:55:43.344621 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.344605 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:55:43.348365 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.348352 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:55:43.399699 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399672 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9fec3bd-e435-4443-8f93-f18406c9bc9a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.399818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399705 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6469\" (UniqueName: \"kubernetes.io/projected/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-api-access-z6469\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.399818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399726 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e34fff2f-e990-435e-ada9-a8f5ac7799cc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.399818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399743 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.399818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399805 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.399981 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399919 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.399981 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399949 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.399981 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.399971 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.400091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.400009 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.400091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.400034 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9dsg\" (UniqueName: \"kubernetes.io/projected/b9fec3bd-e435-4443-8f93-f18406c9bc9a-kube-api-access-n9dsg\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.500735 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500703 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500743 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9dsg\" (UniqueName: \"kubernetes.io/projected/b9fec3bd-e435-4443-8f93-f18406c9bc9a-kube-api-access-n9dsg\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500774 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500806 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9fec3bd-e435-4443-8f93-f18406c9bc9a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500830 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4bf8\" (UniqueName: \"kubernetes.io/projected/b05e9c85-30b0-49cf-a0ed-17c42870fe63-kube-api-access-r4bf8\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500855 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-tls\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.500894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500880 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-textfile\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.500995 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6469\" (UniqueName: \"kubernetes.io/projected/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-api-access-z6469\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501028 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e34fff2f-e990-435e-ada9-a8f5ac7799cc-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501052 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501081 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-wtmp\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501102 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501154 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-sys\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501182 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-accelerators-collector-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501203 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:43.501204 2560 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501223 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501250 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-metrics-client-ca\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:43.501265 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls podName:e34fff2f-e990-435e-ada9-a8f5ac7799cc nodeName:}" failed. No retries permitted until 2026-04-16 19:55:44.001244842 +0000 UTC m=+110.334626156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-hsz8z" (UID: "e34fff2f-e990-435e-ada9-a8f5ac7799cc") : secret "kube-state-metrics-tls" not found
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501296 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-root\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc"
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501319 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501346 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"
Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501394 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e34fff2f-e990-435e-ada9-a8f5ac7799cc-volume-directive-shadow\") pod
\"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:43.501587 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501506 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:43.501923 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501617 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9fec3bd-e435-4443-8f93-f18406c9bc9a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" Apr 16 19:55:43.501923 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.501892 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:43.503515 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.503488 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" Apr 16 19:55:43.503713 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.503693 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:43.503799 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.503748 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9fec3bd-e435-4443-8f93-f18406c9bc9a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" Apr 16 19:55:43.511784 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.511748 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9dsg\" (UniqueName: \"kubernetes.io/projected/b9fec3bd-e435-4443-8f93-f18406c9bc9a-kube-api-access-n9dsg\") pod \"openshift-state-metrics-9d44df66c-2tlfk\" (UID: \"b9fec3bd-e435-4443-8f93-f18406c9bc9a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" Apr 16 19:55:43.511926 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.511908 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6469\" (UniqueName: \"kubernetes.io/projected/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-api-access-z6469\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:43.592237 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.592173 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" Apr 16 19:55:43.601991 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.601969 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-metrics-client-ca\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.601999 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-root\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602035 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602067 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4bf8\" (UniqueName: \"kubernetes.io/projected/b05e9c85-30b0-49cf-a0ed-17c42870fe63-kube-api-access-r4bf8\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602097 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602091 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-tls\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602322 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602100 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-root\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602322 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602230 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-textfile\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602322 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602305 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-wtmp\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602463 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602335 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-sys\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602463 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602368 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" 
(UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-accelerators-collector-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602562 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602477 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-sys\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602655 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-metrics-client-ca\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602670 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-wtmp\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602796 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602697 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-textfile\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.602967 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.602939 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-accelerators-collector-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.604490 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.604470 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-tls\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.604874 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.604852 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b05e9c85-30b0-49cf-a0ed-17c42870fe63-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.610766 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.610748 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4bf8\" (UniqueName: \"kubernetes.io/projected/b05e9c85-30b0-49cf-a0ed-17c42870fe63-kube-api-access-r4bf8\") pod \"node-exporter-ff6mc\" (UID: \"b05e9c85-30b0-49cf-a0ed-17c42870fe63\") " pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.640753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.640723 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ff6mc" Apr 16 19:55:43.649958 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:43.649910 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb05e9c85_30b0_49cf_a0ed_17c42870fe63.slice/crio-ada052be98da7ef615bcb7a815e5a8fdd8e2df5d486241f9997306dbea23bde3 WatchSource:0}: Error finding container ada052be98da7ef615bcb7a815e5a8fdd8e2df5d486241f9997306dbea23bde3: Status 404 returned error can't find the container with id ada052be98da7ef615bcb7a815e5a8fdd8e2df5d486241f9997306dbea23bde3 Apr 16 19:55:43.714157 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:43.714122 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk"] Apr 16 19:55:43.717276 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:43.717246 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fec3bd_e435_4443_8f93_f18406c9bc9a.slice/crio-e39728441c2cd64b40ad5d21a61620b94152c65734f8295d52d110d97cf4088a WatchSource:0}: Error finding container e39728441c2cd64b40ad5d21a61620b94152c65734f8295d52d110d97cf4088a: Status 404 returned error can't find the container with id e39728441c2cd64b40ad5d21a61620b94152c65734f8295d52d110d97cf4088a Apr 16 19:55:44.005407 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.005365 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:44.007645 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.007619 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e34fff2f-e990-435e-ada9-a8f5ac7799cc-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hsz8z\" (UID: \"e34fff2f-e990-435e-ada9-a8f5ac7799cc\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:44.211677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.211643 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" Apr 16 19:55:44.349835 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.349707 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hsz8z"] Apr 16 19:55:44.444999 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.444864 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:44.473576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.473509 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:44.473902 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.473653 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.476615 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.476594 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:55:44.476731 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.476595 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:55:44.477009 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.476954 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:55:44.477009 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.476976 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:55:44.477198 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.477021 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:55:44.477198 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.477043 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h2rrk\"" Apr 16 19:55:44.477198 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.477044 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:55:44.477198 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.476976 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:55:44.477399 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.477382 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:55:44.492371 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.492348 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:55:44.528744 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:44.528714 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34fff2f_e990_435e_ada9_a8f5ac7799cc.slice/crio-5670ad91713830c1b58cb7e0bda37772c267d2a914022dc35e070893268ab76b WatchSource:0}: Error finding container 5670ad91713830c1b58cb7e0bda37772c267d2a914022dc35e070893268ab76b: Status 404 returned error can't find the container with id 5670ad91713830c1b58cb7e0bda37772c267d2a914022dc35e070893268ab76b Apr 16 19:55:44.598079 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.598047 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ff6mc" event={"ID":"b05e9c85-30b0-49cf-a0ed-17c42870fe63","Type":"ContainerStarted","Data":"ada052be98da7ef615bcb7a815e5a8fdd8e2df5d486241f9997306dbea23bde3"} Apr 16 19:55:44.599563 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.599532 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" event={"ID":"b9fec3bd-e435-4443-8f93-f18406c9bc9a","Type":"ContainerStarted","Data":"7ca73535bb1f73cd9f8a9afd71704f7e5fd8875b3ad8adc627343cbd4c47f6ef"} Apr 16 19:55:44.599689 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.599569 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" event={"ID":"b9fec3bd-e435-4443-8f93-f18406c9bc9a","Type":"ContainerStarted","Data":"d2ac04130dd878ef18bb44c7e39b62ecb766af909cf1059f9f5383df87629b67"} Apr 16 19:55:44.599689 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.599581 2560 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" event={"ID":"b9fec3bd-e435-4443-8f93-f18406c9bc9a","Type":"ContainerStarted","Data":"e39728441c2cd64b40ad5d21a61620b94152c65734f8295d52d110d97cf4088a"} Apr 16 19:55:44.600435 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.600416 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" event={"ID":"e34fff2f-e990-435e-ada9-a8f5ac7799cc","Type":"ContainerStarted","Data":"5670ad91713830c1b58cb7e0bda37772c267d2a914022dc35e070893268ab76b"} Apr 16 19:55:44.611957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.611928 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612073 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.611983 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612073 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612028 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612187 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:55:44.612069 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612100 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612156 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612211 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612262 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612305 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612333 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtgq\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612471 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612387 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612471 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612416 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:44.612471 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.612462 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.712951 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.712918 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.712956 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.712975 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.712998 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713023 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713046 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713096 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713065 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713374 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713096 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713374 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713257 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713374 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713305 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twtgq\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713374 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713336 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713374 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713366 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.713599 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713421 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.714273 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.713783 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.714693 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.714659 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.715226 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.715208 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716189 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716313 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716190 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716313 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716277 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716641 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716615 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716751 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716673 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.716921 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.716901 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.717080 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.717064 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.718026 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.718000 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.718506 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.718485 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.725011 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.724951 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtgq\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq\") pod \"alertmanager-main-0\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.785012 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.784980 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:44.975335 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:44.975311 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:55:44.978389 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:44.978356 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod315acf74_b875_49c3_9a7f_5cf0e546f3b7.slice/crio-ea58e065964564be8c30476d0751fdd910519938f8c3850f19bff81c41a2533d WatchSource:0}: Error finding container ea58e065964564be8c30476d0751fdd910519938f8c3850f19bff81c41a2533d: Status 404 returned error can't find the container with id ea58e065964564be8c30476d0751fdd910519938f8c3850f19bff81c41a2533d
Apr 16 19:55:45.363332 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.363252 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"]
Apr 16 19:55:45.375270 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.375246 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.379160 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379133 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 19:55:45.379430 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379415 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7dh4pdbbqb4fl\""
Apr 16 19:55:45.379521 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379505 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rx6bt\""
Apr 16 19:55:45.379586 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379528 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 19:55:45.379645 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379587 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 19:55:45.379645 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379603 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"]
Apr 16 19:55:45.379870 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379853 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 19:55:45.379870 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.379859 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 19:55:45.520937 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.520899 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521124 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.520948 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521124 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.520984 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521124 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.521051 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521124 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.521082 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtw5\" (UniqueName: \"kubernetes.io/projected/b4cdbf7b-b55d-40a7-832b-619ef6754e08-kube-api-access-bhtw5\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.521126 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.521214 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cdbf7b-b55d-40a7-832b-619ef6754e08-metrics-client-ca\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.521301 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.521271 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-grpc-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.605169 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.605136 2560 generic.go:358] "Generic (PLEG): container finished" podID="b05e9c85-30b0-49cf-a0ed-17c42870fe63" containerID="583f1ce2ecc52a17a8ff69661249a1275e43d4c6b642ed6a67b52e2f850a7bca" exitCode=0
Apr 16 19:55:45.605346 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.605213 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ff6mc" event={"ID":"b05e9c85-30b0-49cf-a0ed-17c42870fe63","Type":"ContainerDied","Data":"583f1ce2ecc52a17a8ff69661249a1275e43d4c6b642ed6a67b52e2f850a7bca"}
Apr 16 19:55:45.606472 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.606448 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"ea58e065964564be8c30476d0751fdd910519938f8c3850f19bff81c41a2533d"}
Apr 16 19:55:45.622722 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622645 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-grpc-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622722 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622694 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622899 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622724 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622899 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622757 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622899 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622803 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622899 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622833 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtw5\" (UniqueName: \"kubernetes.io/projected/b4cdbf7b-b55d-40a7-832b-619ef6754e08-kube-api-access-bhtw5\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.622899 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622857 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.623331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.622903 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cdbf7b-b55d-40a7-832b-619ef6754e08-metrics-client-ca\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.625408 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.625380 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cdbf7b-b55d-40a7-832b-619ef6754e08-metrics-client-ca\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.625772 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.625745 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.625851 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.625758 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-grpc-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.626452 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.626421 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.627701 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.627676 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.627822 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.627777 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.627934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.627914 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b4cdbf7b-b55d-40a7-832b-619ef6754e08-secret-thanos-querier-tls\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.639474 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.639454 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtw5\" (UniqueName: \"kubernetes.io/projected/b4cdbf7b-b55d-40a7-832b-619ef6754e08-kube-api-access-bhtw5\") pod \"thanos-querier-84b5f7ccf8-r95kp\" (UID: \"b4cdbf7b-b55d-40a7-832b-619ef6754e08\") " pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:45.685643 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:45.685613 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"
Apr 16 19:55:46.034795 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.034757 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:46.035010 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.034811 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:46.039529 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.039497 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:46.598420 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.598396 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp"]
Apr 16 19:55:46.602433 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:46.602409 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cdbf7b_b55d_40a7_832b_619ef6754e08.slice/crio-2ea99a496f49fe1085e0376a474817a20b5e34abfc6d4841ced3be3d5105a642 WatchSource:0}: Error finding container 2ea99a496f49fe1085e0376a474817a20b5e34abfc6d4841ced3be3d5105a642: Status 404 returned error can't find the container with id 2ea99a496f49fe1085e0376a474817a20b5e34abfc6d4841ced3be3d5105a642
Apr 16 19:55:46.620446 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.620418 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ff6mc" event={"ID":"b05e9c85-30b0-49cf-a0ed-17c42870fe63","Type":"ContainerStarted","Data":"f61053309855b55d9289e5dd365b4a0f62dc045e66a06ee98bf24e8d7f4fd219"}
Apr 16 19:55:46.621481 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.621455 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"2ea99a496f49fe1085e0376a474817a20b5e34abfc6d4841ced3be3d5105a642"}
Apr 16 19:55:46.626165 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:46.626147 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-689b7596d9-x4xwj"
Apr 16 19:55:47.625391 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.625355 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" event={"ID":"e34fff2f-e990-435e-ada9-a8f5ac7799cc","Type":"ContainerStarted","Data":"6780d593b82600b98f886eb2d443f1beb1b5645ee1b04bfb678f99b247b71ffe"}
Apr 16 19:55:47.625391 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.625398 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" event={"ID":"e34fff2f-e990-435e-ada9-a8f5ac7799cc","Type":"ContainerStarted","Data":"5b14f1b425c9e5386fa826c5224b400d6a705c57ef7391ea8feb4c3a7b8606f2"}
Apr 16 19:55:47.625998 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.625408 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" event={"ID":"e34fff2f-e990-435e-ada9-a8f5ac7799cc","Type":"ContainerStarted","Data":"0a2d8fdebcbbe0205e76e20c1bcbce530fce0030aa249b1dd5ddbc4a3bcb6f1e"}
Apr 16 19:55:47.627288 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.627261 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ff6mc" event={"ID":"b05e9c85-30b0-49cf-a0ed-17c42870fe63","Type":"ContainerStarted","Data":"291f7c44164120f6b4887b43339c70c7d4a72181d5fecd2dedeb0745df002a04"}
Apr 16 19:55:47.629529 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.629490 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" event={"ID":"b9fec3bd-e435-4443-8f93-f18406c9bc9a","Type":"ContainerStarted","Data":"4fa8541472cc5fd47c52092351b11baa5f119d94d5029671a0685ed2608b82b3"}
Apr 16 19:55:47.630861 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.630836 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="f0684766d67b543df695c8b8f9da36c1f49c620f4e4e123d7d003702aa2bfb3a" exitCode=0
Apr 16 19:55:47.630985 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.630925 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"f0684766d67b543df695c8b8f9da36c1f49c620f4e4e123d7d003702aa2bfb3a"}
Apr 16 19:55:47.644639 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.644597 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-hsz8z" podStartSLOduration=2.713323339 podStartE2EDuration="4.644583317s" podCreationTimestamp="2026-04-16 19:55:43 +0000 UTC" firstStartedPulling="2026-04-16 19:55:44.530612306 +0000 UTC m=+110.863993605" lastFinishedPulling="2026-04-16 19:55:46.461872281 +0000 UTC m=+112.795253583" observedRunningTime="2026-04-16 19:55:47.643046415 +0000 UTC m=+113.976427737" watchObservedRunningTime="2026-04-16 19:55:47.644583317 +0000 UTC m=+113.977964643"
Apr 16 19:55:47.662258 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.662213 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2tlfk" podStartSLOduration=2.193042974 podStartE2EDuration="4.662201321s" podCreationTimestamp="2026-04-16 19:55:43 +0000 UTC" firstStartedPulling="2026-04-16 19:55:43.992581104 +0000 UTC m=+110.325962403" lastFinishedPulling="2026-04-16 19:55:46.461739438 +0000 UTC m=+112.795120750" observedRunningTime="2026-04-16 19:55:47.661729117 +0000 UTC m=+113.995110498" watchObservedRunningTime="2026-04-16 19:55:47.662201321 +0000 UTC m=+113.995582643"
Apr 16 19:55:47.680308 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:47.680264 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ff6mc" podStartSLOduration=3.500086136 podStartE2EDuration="4.680250172s" podCreationTimestamp="2026-04-16 19:55:43 +0000 UTC" firstStartedPulling="2026-04-16 19:55:43.652377759 +0000 UTC m=+109.985759061" lastFinishedPulling="2026-04-16 19:55:44.832541792 +0000 UTC m=+111.165923097" observedRunningTime="2026-04-16 19:55:47.679514756 +0000 UTC m=+114.012896078" watchObservedRunningTime="2026-04-16 19:55:47.680250172 +0000 UTC m=+114.013631499"
Apr 16 19:55:48.063594 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.063557 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm"]
Apr 16 19:55:48.088246 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.088220 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm"]
Apr 16 19:55:48.088404 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.088360 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm"
Apr 16 19:55:48.091628 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.091376 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 19:55:48.091628 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.091509 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-c5f8p\""
Apr 16 19:55:48.096977 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.096953 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"]
Apr 16 19:55:48.118594 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.118569 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"]
Apr 16 19:55:48.118739 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.118698 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.125996 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.125969 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 19:55:48.246565 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246527 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.246753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246574 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.246753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246632 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.246753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246675 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qbxfm\" (UID: \"e0986774-86cf-4b7b-9210-db0b4d7d82f9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm"
Apr 16 19:55:48.246753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246699 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.246753 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246738 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.247030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246789 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfdx\" (UniqueName: \"kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.247030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.246872 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348228 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348275 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfdx\" (UniqueName: \"kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348305 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348589 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348359 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348918 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348739 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p"
Apr 16 19:55:48.348918 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348819 2560 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.348918 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348866 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qbxfm\" (UID: \"e0986774-86cf-4b7b-9210-db0b4d7d82f9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:48.348918 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.348913 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.349218 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:48.349144 2560 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 19:55:48.349218 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:55:48.349214 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert podName:e0986774-86cf-4b7b-9210-db0b4d7d82f9 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:48.849193129 +0000 UTC m=+115.182574431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-qbxfm" (UID: "e0986774-86cf-4b7b-9210-db0b4d7d82f9") : secret "monitoring-plugin-cert" not found Apr 16 19:55:48.349443 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.349419 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.349517 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.349452 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.349773 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.349733 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.351102 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.351080 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.351510 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.351493 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.352878 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.352856 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.358089 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.358065 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfdx\" (UniqueName: \"kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx\") pod \"console-5c6bd8d65f-mn94p\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.429366 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.429327 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:55:48.571728 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.571700 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"] Apr 16 19:55:48.855529 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.855492 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qbxfm\" (UID: \"e0986774-86cf-4b7b-9210-db0b4d7d82f9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:48.858318 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:48.858291 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e0986774-86cf-4b7b-9210-db0b4d7d82f9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qbxfm\" (UID: \"e0986774-86cf-4b7b-9210-db0b4d7d82f9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:49.002660 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.002616 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:49.096179 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:49.096138 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51d38236_bd05_418d_8307_b0bf54184953.slice/crio-eb5b6fe46dcd6a24b69e7f43e0955c521cdf1a21a97c085ff8bb79e2351047f5 WatchSource:0}: Error finding container eb5b6fe46dcd6a24b69e7f43e0955c521cdf1a21a97c085ff8bb79e2351047f5: Status 404 returned error can't find the container with id eb5b6fe46dcd6a24b69e7f43e0955c521cdf1a21a97c085ff8bb79e2351047f5 Apr 16 19:55:49.535981 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.535949 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:49.550870 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.550840 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.552669 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.552599 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:49.553773 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.553747 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:55:49.553897 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.553800 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.553978 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554010 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554021 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554063 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554075 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:55:49.554215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554019 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:55:49.554684 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554666 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:55:49.554902 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554848 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-b61u7l559ts5g\"" Apr 16 19:55:49.555033 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.554981 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:55:49.555303 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.555283 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sbnfd\"" Apr 16 19:55:49.555395 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.555287 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:55:49.557208 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.557190 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:55:49.561343 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.561322 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:55:49.638251 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.638211 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6bd8d65f-mn94p" event={"ID":"51d38236-bd05-418d-8307-b0bf54184953","Type":"ContainerStarted","Data":"eb5b6fe46dcd6a24b69e7f43e0955c521cdf1a21a97c085ff8bb79e2351047f5"} Apr 16 19:55:49.662712 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662671 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.662880 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662729 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.662880 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662759 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.662880 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662789 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.662880 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662816 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662909 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662938 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpj7\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662963 2560 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.662988 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663030 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663015 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663052 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663073 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663093 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663130 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663197 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663224 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663265 2560 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.663369 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.663296 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.763894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.763862 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.763894 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.763897 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.763922 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764124 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764186 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764377 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764220 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764377 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764246 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764377 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764280 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
19:55:49.764377 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764310 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764377 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764319 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764386 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764413 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpj7\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764447 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764474 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764506 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764542 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764577 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.764624 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764612 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.765022 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.764647 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.765406 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.765380 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.767104 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.766848 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.768183 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.767883 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.770091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.768530 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.770091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.768773 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.770091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.768837 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.770091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.769143 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.770091 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.769510 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.771877 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.771845 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.771982 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.771854 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.771982 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.771881 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.772314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.772258 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.772314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.772260 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.773562 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:55:49.773508 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.774082 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.774037 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.774433 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.774388 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.781507 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.781172 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpj7\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7\") pod \"prometheus-k8s-0\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:49.866338 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:49.866251 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:50.096082 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.096053 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm"] Apr 16 19:55:50.099500 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:50.099473 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0986774_86cf_4b7b_9210_db0b4d7d82f9.slice/crio-2bfd6e67f99a8c532ee17054e8090711233c0bf626ab0b287bd8b3fe5ad7c7f3 WatchSource:0}: Error finding container 2bfd6e67f99a8c532ee17054e8090711233c0bf626ab0b287bd8b3fe5ad7c7f3: Status 404 returned error can't find the container with id 2bfd6e67f99a8c532ee17054e8090711233c0bf626ab0b287bd8b3fe5ad7c7f3 Apr 16 19:55:50.118214 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.118156 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:50.122992 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:55:50.122933 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131e7fcc_a2a6_4516_bb92_7f51371d0ebc.slice/crio-9a4336224e8313439c1ef7c4d3a9f0152b6cb49a802f3e1df5a0519c63918459 WatchSource:0}: Error finding container 9a4336224e8313439c1ef7c4d3a9f0152b6cb49a802f3e1df5a0519c63918459: Status 404 returned error can't find the container with id 9a4336224e8313439c1ef7c4d3a9f0152b6cb49a802f3e1df5a0519c63918459 Apr 16 19:55:50.648234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.648143 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"607878292de8f795b95dee98e8e7dd027a5e6d31fb3c5a0bdb3ca766fcadad86"} Apr 16 19:55:50.648234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.648187 2560 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"71c09f84716db30b31ad6f2ab0022048e02cda3b02e83e5c2bcbe9860536b8c4"} Apr 16 19:55:50.648234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.648201 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"9a9fddd000d7c761e562b14b28b5ee64b25c5977b546892d70f2c070ae40d58b"} Apr 16 19:55:50.648234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.648214 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"b71d38ea275698060da1b38c2d1fe37fd923f44b990f61e16627c97262185bfe"} Apr 16 19:55:50.648234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.648227 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"a516d2752681bb11eb8091147531dd017bf0e0f9397d46b7995a11ddb63f5a24"} Apr 16 19:55:50.649752 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.649725 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" exitCode=0 Apr 16 19:55:50.649885 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.649818 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} Apr 16 19:55:50.649885 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.649849 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"9a4336224e8313439c1ef7c4d3a9f0152b6cb49a802f3e1df5a0519c63918459"} Apr 16 19:55:50.651174 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.651152 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" event={"ID":"e0986774-86cf-4b7b-9210-db0b4d7d82f9","Type":"ContainerStarted","Data":"2bfd6e67f99a8c532ee17054e8090711233c0bf626ab0b287bd8b3fe5ad7c7f3"} Apr 16 19:55:50.652691 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.652653 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6bd8d65f-mn94p" event={"ID":"51d38236-bd05-418d-8307-b0bf54184953","Type":"ContainerStarted","Data":"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9"} Apr 16 19:55:50.654535 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.654514 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"fc6d719422dca0a0d04422bc4038054b028680d224ecdcb303f7dd1df372076f"} Apr 16 19:55:50.654637 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.654541 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"0c4dc8434e611145bf262b77ef5c3642e608e5e4b611b363206321f7c009e715"} Apr 16 19:55:50.654637 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:50.654556 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"d01bfd1c0988c7eb727c670134f5bfb78782e99f05005a19840e74a4b197d2f4"} Apr 16 19:55:50.698015 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:55:50.697963 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6bd8d65f-mn94p" podStartSLOduration=2.697945241 podStartE2EDuration="2.697945241s" podCreationTimestamp="2026-04-16 19:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:50.697277758 +0000 UTC m=+117.030659080" watchObservedRunningTime="2026-04-16 19:55:50.697945241 +0000 UTC m=+117.031326587" Apr 16 19:55:52.666102 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.666067 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerStarted","Data":"04171d5dd0a1498d48fea984620f681adab5bd2c90c2d8d1dac75c048daafb7f"} Apr 16 19:55:52.667808 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.667782 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" event={"ID":"e0986774-86cf-4b7b-9210-db0b4d7d82f9","Type":"ContainerStarted","Data":"04fcefbb2d12af2b00c0eadd699a54c0eeb98dc47061d60048a850b4976a57fb"} Apr 16 19:55:52.667942 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.667914 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:52.671217 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.671180 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"eaadf2c760b2d81db23749f84c8016d82cf432ac54c3efd11c5e48b492b779a6"} Apr 16 19:55:52.671217 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.671209 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" 
event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"5e0cb4d79d0e7261693d31de0214ae6d44ff07135c0d2bc62526a52e23c3cf08"} Apr 16 19:55:52.671388 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.671223 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" event={"ID":"b4cdbf7b-b55d-40a7-832b-619ef6754e08","Type":"ContainerStarted","Data":"7a0b005b8660e95ae59b092ab5670d2dc65b24bef6aac916103eda10cb98964c"} Apr 16 19:55:52.671501 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.671473 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" Apr 16 19:55:52.673810 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.673792 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" Apr 16 19:55:52.694669 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.694620 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.594807592 podStartE2EDuration="8.694604167s" podCreationTimestamp="2026-04-16 19:55:44 +0000 UTC" firstStartedPulling="2026-04-16 19:55:44.981007676 +0000 UTC m=+111.314388988" lastFinishedPulling="2026-04-16 19:55:52.080804264 +0000 UTC m=+118.414185563" observedRunningTime="2026-04-16 19:55:52.691689679 +0000 UTC m=+119.025071035" watchObservedRunningTime="2026-04-16 19:55:52.694604167 +0000 UTC m=+119.027985490" Apr 16 19:55:52.714525 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.713017 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" podStartSLOduration=2.237161875 podStartE2EDuration="7.712998616s" podCreationTimestamp="2026-04-16 19:55:45 +0000 UTC" firstStartedPulling="2026-04-16 19:55:46.604757097 +0000 UTC m=+112.938138398" 
lastFinishedPulling="2026-04-16 19:55:52.08059384 +0000 UTC m=+118.413975139" observedRunningTime="2026-04-16 19:55:52.712413667 +0000 UTC m=+119.045794989" watchObservedRunningTime="2026-04-16 19:55:52.712998616 +0000 UTC m=+119.046379940" Apr 16 19:55:52.732122 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:52.732045 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qbxfm" podStartSLOduration=2.754926062 podStartE2EDuration="4.732027166s" podCreationTimestamp="2026-04-16 19:55:48 +0000 UTC" firstStartedPulling="2026-04-16 19:55:50.101384609 +0000 UTC m=+116.434765909" lastFinishedPulling="2026-04-16 19:55:52.078485713 +0000 UTC m=+118.411867013" observedRunningTime="2026-04-16 19:55:52.730221496 +0000 UTC m=+119.063602818" watchObservedRunningTime="2026-04-16 19:55:52.732027166 +0000 UTC m=+119.065408487" Apr 16 19:55:54.033544 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:54.033506 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"] Apr 16 19:55:55.683513 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683482 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} Apr 16 19:55:55.683513 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683517 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} Apr 16 19:55:55.683949 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683526 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} Apr 16 19:55:55.683949 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683535 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} Apr 16 19:55:55.683949 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683543 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} Apr 16 19:55:55.683949 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.683552 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerStarted","Data":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} Apr 16 19:55:55.712297 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:55.712245 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.486226658 podStartE2EDuration="6.712230518s" podCreationTimestamp="2026-04-16 19:55:49 +0000 UTC" firstStartedPulling="2026-04-16 19:55:50.651239103 +0000 UTC m=+116.984620403" lastFinishedPulling="2026-04-16 19:55:54.877242951 +0000 UTC m=+121.210624263" observedRunningTime="2026-04-16 19:55:55.710086598 +0000 UTC m=+122.043467946" watchObservedRunningTime="2026-04-16 19:55:55.712230518 +0000 UTC m=+122.045611839" Apr 16 19:55:58.430143 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:58.430093 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 
16 19:55:58.681405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:58.681333 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-84b5f7ccf8-r95kp" Apr 16 19:55:59.867279 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:55:59.867240 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:04.887735 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:04.887698 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"] Apr 16 19:56:17.023554 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:17.023524 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t7r5g_8b2bac48-99a7-47ac-b46a-269204d0bfe5/serve-healthcheck-canary/0.log" Apr 16 19:56:19.054889 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.054831 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c6bd8d65f-mn94p" podUID="51d38236-bd05-418d-8307-b0bf54184953" containerName="console" containerID="cri-o://96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9" gracePeriod=15 Apr 16 19:56:19.315121 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.315051 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6bd8d65f-mn94p_51d38236-bd05-418d-8307-b0bf54184953/console/0.log" Apr 16 19:56:19.315225 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.315128 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:56:19.426334 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426304 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426520 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426353 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426520 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426379 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426520 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426422 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426520 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426444 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfdx\" (UniqueName: \"kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426520 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426509 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426786 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426545 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert\") pod \"51d38236-bd05-418d-8307-b0bf54184953\" (UID: \"51d38236-bd05-418d-8307-b0bf54184953\") " Apr 16 19:56:19.426897 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426861 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.426897 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426874 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.427003 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426893 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config" (OuterVolumeSpecName: "console-config") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.427003 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.426981 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca" (OuterVolumeSpecName: "service-ca") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:19.428608 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.428583 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.428687 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.428601 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx" (OuterVolumeSpecName: "kube-api-access-4vfdx") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "kube-api-access-4vfdx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:19.428687 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.428610 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "51d38236-bd05-418d-8307-b0bf54184953" (UID: "51d38236-bd05-418d-8307-b0bf54184953"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:19.527622 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527586 2560 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-oauth-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527622 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527622 2560 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-console-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527622 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527631 2560 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-oauth-serving-cert\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527622 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527641 2560 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-trusted-ca-bundle\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527865 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527649 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vfdx\" (UniqueName: 
\"kubernetes.io/projected/51d38236-bd05-418d-8307-b0bf54184953-kube-api-access-4vfdx\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527865 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527667 2560 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51d38236-bd05-418d-8307-b0bf54184953-service-ca\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.527865 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.527676 2560 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51d38236-bd05-418d-8307-b0bf54184953-console-serving-cert\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:19.758241 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758211 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6bd8d65f-mn94p_51d38236-bd05-418d-8307-b0bf54184953/console/0.log" Apr 16 19:56:19.758411 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758254 2560 generic.go:358] "Generic (PLEG): container finished" podID="51d38236-bd05-418d-8307-b0bf54184953" containerID="96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9" exitCode=2 Apr 16 19:56:19.758411 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758300 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6bd8d65f-mn94p" event={"ID":"51d38236-bd05-418d-8307-b0bf54184953","Type":"ContainerDied","Data":"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9"} Apr 16 19:56:19.758411 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758331 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6bd8d65f-mn94p" Apr 16 19:56:19.758411 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758344 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6bd8d65f-mn94p" event={"ID":"51d38236-bd05-418d-8307-b0bf54184953","Type":"ContainerDied","Data":"eb5b6fe46dcd6a24b69e7f43e0955c521cdf1a21a97c085ff8bb79e2351047f5"} Apr 16 19:56:19.758411 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.758363 2560 scope.go:117] "RemoveContainer" containerID="96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9" Apr 16 19:56:19.766690 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.766676 2560 scope.go:117] "RemoveContainer" containerID="96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9" Apr 16 19:56:19.766954 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:56:19.766936 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9\": container with ID starting with 96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9 not found: ID does not exist" containerID="96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9" Apr 16 19:56:19.767002 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.766962 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9"} err="failed to get container status \"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9\": rpc error: code = NotFound desc = could not find container \"96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9\": container with ID starting with 96d1f8aae168e4a22260c43b1767626fa1fe4ee854cc5925374753fe9358cee9 not found: ID does not exist" Apr 16 19:56:19.779427 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.779400 2560 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"] Apr 16 19:56:19.783354 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:19.783335 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c6bd8d65f-mn94p"] Apr 16 19:56:20.223616 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:20.223579 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d38236-bd05-418d-8307-b0bf54184953" path="/var/lib/kubelet/pods/51d38236-bd05-418d-8307-b0bf54184953/volumes" Apr 16 19:56:29.910482 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:29.910441 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-689b7596d9-x4xwj" podUID="95b712db-666a-4fc6-8855-ad59a92b9365" containerName="console" containerID="cri-o://1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6" gracePeriod=15 Apr 16 19:56:30.194029 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.194005 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-689b7596d9-x4xwj_95b712db-666a-4fc6-8855-ad59a92b9365/console/0.log" Apr 16 19:56:30.194190 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.194063 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:56:30.325517 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325483 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325699 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325539 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325699 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325557 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r2jt\" (UniqueName: \"kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325699 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325647 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325699 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325685 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325924 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325713 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca\") pod \"95b712db-666a-4fc6-8855-ad59a92b9365\" (UID: \"95b712db-666a-4fc6-8855-ad59a92b9365\") " Apr 16 19:56:30.325924 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325865 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:30.326025 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.325951 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config" (OuterVolumeSpecName: "console-config") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:30.326087 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.326051 2560 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-oauth-serving-cert\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.326087 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.326071 2560 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-console-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.326210 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.326186 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca" (OuterVolumeSpecName: "service-ca") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:30.327724 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.327692 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt" (OuterVolumeSpecName: "kube-api-access-9r2jt") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). InnerVolumeSpecName "kube-api-access-9r2jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:30.327724 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.327712 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:30.327881 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.327769 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "95b712db-666a-4fc6-8855-ad59a92b9365" (UID: "95b712db-666a-4fc6-8855-ad59a92b9365"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:30.426977 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.426902 2560 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-oauth-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.426977 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.426935 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r2jt\" (UniqueName: \"kubernetes.io/projected/95b712db-666a-4fc6-8855-ad59a92b9365-kube-api-access-9r2jt\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.426977 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.426945 2560 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b712db-666a-4fc6-8855-ad59a92b9365-console-serving-cert\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.426977 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.426956 2560 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b712db-666a-4fc6-8855-ad59a92b9365-service-ca\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:56:30.795921 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795857 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-689b7596d9-x4xwj_95b712db-666a-4fc6-8855-ad59a92b9365/console/0.log" Apr 16 19:56:30.795921 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795892 2560 generic.go:358] "Generic (PLEG): container finished" podID="95b712db-666a-4fc6-8855-ad59a92b9365" containerID="1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6" exitCode=2 Apr 16 19:56:30.796094 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795944 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b7596d9-x4xwj" event={"ID":"95b712db-666a-4fc6-8855-ad59a92b9365","Type":"ContainerDied","Data":"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6"} Apr 16 19:56:30.796094 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795975 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b7596d9-x4xwj" event={"ID":"95b712db-666a-4fc6-8855-ad59a92b9365","Type":"ContainerDied","Data":"1c4e099f327b8d2ca2025817722bfd5e1e45429ddb84e84a89a07cc174a3290b"} Apr 16 19:56:30.796094 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795978 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689b7596d9-x4xwj" Apr 16 19:56:30.796094 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.795991 2560 scope.go:117] "RemoveContainer" containerID="1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6" Apr 16 19:56:30.804368 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.804243 2560 scope.go:117] "RemoveContainer" containerID="1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6" Apr 16 19:56:30.804626 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:56:30.804596 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6\": container with ID starting with 1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6 not found: ID does not exist" containerID="1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6" Apr 16 19:56:30.804714 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.804637 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6"} err="failed to get container status \"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6\": rpc error: code = NotFound desc = could not find container \"1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6\": container with ID starting with 1aa0181360542ffa4f55676091a897f19f79bde0b5cd195543478e5b5772bfc6 not found: ID does not exist" Apr 16 19:56:30.825843 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.825798 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"] Apr 16 19:56:30.830916 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:30.830890 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-689b7596d9-x4xwj"] Apr 16 19:56:32.222596 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:56:32.222561 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b712db-666a-4fc6-8855-ad59a92b9365" path="/var/lib/kubelet/pods/95b712db-666a-4fc6-8855-ad59a92b9365/volumes" Apr 16 19:56:49.866953 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:49.866862 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:49.885382 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:49.885356 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:50.871876 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:56:50.871843 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:03.659081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.659045 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:57:03.660441 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660038 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="alertmanager" containerID="cri-o://a516d2752681bb11eb8091147531dd017bf0e0f9397d46b7995a11ddb63f5a24" gracePeriod=120 Apr 16 19:57:03.660441 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660282 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy" containerID="cri-o://71c09f84716db30b31ad6f2ab0022048e02cda3b02e83e5c2bcbe9860536b8c4" gracePeriod=120 Apr 16 19:57:03.660441 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660434 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" 
containerName="prom-label-proxy" containerID="cri-o://04171d5dd0a1498d48fea984620f681adab5bd2c90c2d8d1dac75c048daafb7f" gracePeriod=120 Apr 16 19:57:03.660694 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660445 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-web" containerID="cri-o://9a9fddd000d7c761e562b14b28b5ee64b25c5977b546892d70f2c070ae40d58b" gracePeriod=120 Apr 16 19:57:03.660694 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660455 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="config-reloader" containerID="cri-o://b71d38ea275698060da1b38c2d1fe37fd923f44b990f61e16627c97262185bfe" gracePeriod=120 Apr 16 19:57:03.660694 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.660459 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-metric" containerID="cri-o://607878292de8f795b95dee98e8e7dd027a5e6d31fb3c5a0bdb3ca766fcadad86" gracePeriod=120 Apr 16 19:57:03.903412 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903378 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="04171d5dd0a1498d48fea984620f681adab5bd2c90c2d8d1dac75c048daafb7f" exitCode=0 Apr 16 19:57:03.903412 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903406 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="71c09f84716db30b31ad6f2ab0022048e02cda3b02e83e5c2bcbe9860536b8c4" exitCode=0 Apr 16 19:57:03.903412 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903414 2560 generic.go:358] "Generic (PLEG): container finished" 
podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="b71d38ea275698060da1b38c2d1fe37fd923f44b990f61e16627c97262185bfe" exitCode=0 Apr 16 19:57:03.903412 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903421 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="a516d2752681bb11eb8091147531dd017bf0e0f9397d46b7995a11ddb63f5a24" exitCode=0 Apr 16 19:57:03.903677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903439 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"04171d5dd0a1498d48fea984620f681adab5bd2c90c2d8d1dac75c048daafb7f"} Apr 16 19:57:03.903677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903465 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"71c09f84716db30b31ad6f2ab0022048e02cda3b02e83e5c2bcbe9860536b8c4"} Apr 16 19:57:03.903677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903479 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"b71d38ea275698060da1b38c2d1fe37fd923f44b990f61e16627c97262185bfe"} Apr 16 19:57:03.903677 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:03.903487 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"a516d2752681bb11eb8091147531dd017bf0e0f9397d46b7995a11ddb63f5a24"} Apr 16 19:57:04.910511 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:04.910483 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="607878292de8f795b95dee98e8e7dd027a5e6d31fb3c5a0bdb3ca766fcadad86" 
exitCode=0 Apr 16 19:57:04.910511 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:04.910506 2560 generic.go:358] "Generic (PLEG): container finished" podID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerID="9a9fddd000d7c761e562b14b28b5ee64b25c5977b546892d70f2c070ae40d58b" exitCode=0 Apr 16 19:57:04.910885 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:04.910555 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"607878292de8f795b95dee98e8e7dd027a5e6d31fb3c5a0bdb3ca766fcadad86"} Apr 16 19:57:04.910885 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:04.910600 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"9a9fddd000d7c761e562b14b28b5ee64b25c5977b546892d70f2c070ae40d58b"} Apr 16 19:57:04.924059 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:04.924039 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:05.014023 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.013942 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014023 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.013981 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014023 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014012 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtgq\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014031 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014075 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: 
\"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014093 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014138 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014170 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014195 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014250 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 
19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014280 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014331 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014310 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014726 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014355 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:57:05.014726 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014371 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca\") pod \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\" (UID: \"315acf74-b875-49c3-9a7f-5cf0e546f3b7\") " Apr 16 19:57:05.014726 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014707 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:05.014878 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.014725 2560 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-main-db\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.016839 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.016808 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.016965 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.016838 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.017219 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017194 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:05.017345 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017244 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume" (OuterVolumeSpecName: "config-volume") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.017449 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017427 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out" (OuterVolumeSpecName: "config-out") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:57:05.017721 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017698 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:05.017721 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017709 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.017821 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.017718 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq" (OuterVolumeSpecName: "kube-api-access-twtgq") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "kube-api-access-twtgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:05.019177 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.019155 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.021632 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.021612 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.028769 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.028745 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config" (OuterVolumeSpecName: "web-config") pod "315acf74-b875-49c3-9a7f-5cf0e546f3b7" (UID: "315acf74-b875-49c3-9a7f-5cf0e546f3b7"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:05.115525 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115493 2560 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-web-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115525 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115521 2560 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-metrics-client-ca\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115525 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115532 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-main-tls\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115542 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twtgq\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-kube-api-access-twtgq\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115554 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115562 2560 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-out\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 
19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115571 2560 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-cluster-tls-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115586 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115597 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115605 2560 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/315acf74-b875-49c3-9a7f-5cf0e546f3b7-config-volume\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115614 2560 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/315acf74-b875-49c3-9a7f-5cf0e546f3b7-tls-assets\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.115732 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.115623 2560 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/315acf74-b875-49c3-9a7f-5cf0e546f3b7-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\"" Apr 16 19:57:05.916726 
ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.916691 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"315acf74-b875-49c3-9a7f-5cf0e546f3b7","Type":"ContainerDied","Data":"ea58e065964564be8c30476d0751fdd910519938f8c3850f19bff81c41a2533d"} Apr 16 19:57:05.917187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.916739 2560 scope.go:117] "RemoveContainer" containerID="04171d5dd0a1498d48fea984620f681adab5bd2c90c2d8d1dac75c048daafb7f" Apr 16 19:57:05.917187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.916771 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:05.924835 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.924808 2560 scope.go:117] "RemoveContainer" containerID="607878292de8f795b95dee98e8e7dd027a5e6d31fb3c5a0bdb3ca766fcadad86" Apr 16 19:57:05.931343 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.931323 2560 scope.go:117] "RemoveContainer" containerID="71c09f84716db30b31ad6f2ab0022048e02cda3b02e83e5c2bcbe9860536b8c4" Apr 16 19:57:05.937603 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.937587 2560 scope.go:117] "RemoveContainer" containerID="9a9fddd000d7c761e562b14b28b5ee64b25c5977b546892d70f2c070ae40d58b" Apr 16 19:57:05.942672 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.942653 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:57:05.945237 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.945211 2560 scope.go:117] "RemoveContainer" containerID="b71d38ea275698060da1b38c2d1fe37fd923f44b990f61e16627c97262185bfe" Apr 16 19:57:05.948088 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.948064 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:57:05.951950 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.951934 2560 scope.go:117] "RemoveContainer" 
containerID="a516d2752681bb11eb8091147531dd017bf0e0f9397d46b7995a11ddb63f5a24" Apr 16 19:57:05.957958 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.957941 2560 scope.go:117] "RemoveContainer" containerID="f0684766d67b543df695c8b8f9da36c1f49c620f4e4e123d7d003702aa2bfb3a" Apr 16 19:57:05.973957 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.973937 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:57:05.974260 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974246 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974263 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974273 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="init-config-reloader" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974278 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="init-config-reloader" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974288 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-web" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974293 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-web" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974301 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-metric" Apr 16 19:57:05.974307 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974306 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-metric" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974312 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="prom-label-proxy" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974317 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="prom-label-proxy" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974327 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="config-reloader" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974332 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="config-reloader" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974339 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="alertmanager" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974345 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="alertmanager" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974350 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b712db-666a-4fc6-8855-ad59a92b9365" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974355 2560 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95b712db-666a-4fc6-8855-ad59a92b9365" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974362 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51d38236-bd05-418d-8307-b0bf54184953" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974366 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d38236-bd05-418d-8307-b0bf54184953" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974409 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="51d38236-bd05-418d-8307-b0bf54184953" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974415 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-web" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974429 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b712db-666a-4fc6-8855-ad59a92b9365" containerName="console" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974434 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="config-reloader" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974441 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974449 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="alertmanager" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974454 2560 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="prom-label-proxy" Apr 16 19:57:05.974533 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.974460 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" containerName="kube-rbac-proxy-metric" Apr 16 19:57:05.980370 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.980350 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:05.983153 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983104 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h2rrk\"" Apr 16 19:57:05.983153 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983101 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:57:05.983320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983105 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:57:05.983320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983102 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:57:05.983320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983210 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:57:05.983320 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983251 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:57:05.983504 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983369 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:57:05.983567 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983543 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:57:05.983616 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.983590 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:57:05.989526 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.989505 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:57:05.993829 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:05.993331 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:57:06.123215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123095 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123215 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123181 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123217 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-out\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123246 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123268 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123291 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123311 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123413 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123381 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxkbt\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-kube-api-access-xxkbt\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123420 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123438 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-volume\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123483 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123511 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.123666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.123564 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-web-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.223299 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.223269 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315acf74-b875-49c3-9a7f-5cf0e546f3b7" path="/var/lib/kubelet/pods/315acf74-b875-49c3-9a7f-5cf0e546f3b7/volumes" Apr 16 19:57:06.224018 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224000 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-out\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224066 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224030 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224066 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224051 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224145 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224079 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224234 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224210 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224290 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224269 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxkbt\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-kube-api-access-xxkbt\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224343 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224310 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224343 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224336 2560 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-volume\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224443 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224366 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224443 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224394 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224590 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224444 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-web-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224590 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224495 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224590 ip-10-0-131-77 kubenswrapper[2560]: 
I0416 19:57:06.224554 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.224739 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.224615 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.225405 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.225028 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.225534 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.225514 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5811c3b-fd59-47fa-8b21-1124a15657a2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.227031 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.227003 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.227244 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.227211 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-out\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.227339 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.227265 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.227703 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.227643 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-web-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.227703 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.227643 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.228081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.228058 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.228172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.228079 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.228172 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.228129 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d5811c3b-fd59-47fa-8b21-1124a15657a2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.229229 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.229210 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.232382 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.232366 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxkbt\" (UniqueName: \"kubernetes.io/projected/d5811c3b-fd59-47fa-8b21-1124a15657a2-kube-api-access-xxkbt\") pod \"alertmanager-main-0\" (UID: \"d5811c3b-fd59-47fa-8b21-1124a15657a2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:57:06.296192 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.296172 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:57:06.420690 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.420667 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:57:06.422723 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:57:06.422695 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5811c3b_fd59_47fa_8b21_1124a15657a2.slice/crio-9ea26668f36de22492a087edd0a1b179036058b01c07989e0d4109f8cdb390fd WatchSource:0}: Error finding container 9ea26668f36de22492a087edd0a1b179036058b01c07989e0d4109f8cdb390fd: Status 404 returned error can't find the container with id 9ea26668f36de22492a087edd0a1b179036058b01c07989e0d4109f8cdb390fd
Apr 16 19:57:06.920517 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.920482 2560 generic.go:358] "Generic (PLEG): container finished" podID="d5811c3b-fd59-47fa-8b21-1124a15657a2" containerID="4bf7f8924ade3e2fca0fe4e11e9edc287653cd6585ceecc615b4e53a72227952" exitCode=0
Apr 16 19:57:06.920902 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.920569 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerDied","Data":"4bf7f8924ade3e2fca0fe4e11e9edc287653cd6585ceecc615b4e53a72227952"}
Apr 16 19:57:06.920902 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:06.920604 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"9ea26668f36de22492a087edd0a1b179036058b01c07989e0d4109f8cdb390fd"}
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.880985 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.881884 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="prometheus" containerID="cri-o://8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" gracePeriod=600
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.882174 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy" containerID="cri-o://622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" gracePeriod=600
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.882241 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="thanos-sidecar" containerID="cri-o://073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491" gracePeriod=600
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.882303 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="config-reloader" containerID="cri-o://5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd" gracePeriod=600
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.882362 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-web" containerID="cri-o://2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf" gracePeriod=600
Apr 16 19:57:07.885723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.882350 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" gracePeriod=600
Apr 16 19:57:07.930243 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930216 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"5d0726653ffbfb91c5a36b64220ff8a8b2b1e47864bca26cb0362fa86a0be8de"}
Apr 16 19:57:07.930576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930256 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"0484590c6a75c55d32e5f384e739edc74a4942713095ef46cecc62e124e6185e"}
Apr 16 19:57:07.930576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930270 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"2010968efd02a2035b8834e335d43a0098b42010cac4972e9b4bca55f516f027"}
Apr 16 19:57:07.930576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930282 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"64f37f2dac13d8e4df0756af168bebb121f8e097bc7caf35cfbd56783d373d80"}
Apr 16 19:57:07.930576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930294 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"57b561c3e2ebdeb07112624c77d6ab43150694c546d06beab889b7e02bc5b3ce"}
Apr 16 19:57:07.930576 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.930307 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5811c3b-fd59-47fa-8b21-1124a15657a2","Type":"ContainerStarted","Data":"b211436abbce07de3bcbba182bc05edd218edf37a37ced82522da7682359d8dd"}
Apr 16 19:57:07.961940 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:07.961883 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.961864424 podStartE2EDuration="2.961864424s" podCreationTimestamp="2026-04-16 19:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:57:07.960209645 +0000 UTC m=+194.293591005" watchObservedRunningTime="2026-04-16 19:57:07.961864424 +0000 UTC m=+194.295245746"
Apr 16 19:57:08.143308 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.143281 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:08.241121 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241095 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdpj7\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241145 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241175 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241194 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241209 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241228 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241287 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241273 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241303 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241344 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241371 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241403 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241435 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241457 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241510 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241538 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.241584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241568 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241603 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241621 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241601 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241653 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config\") pod \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\" (UID: \"131e7fcc-a2a6-4516-bb92-7f51371d0ebc\") "
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241929 2560 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.242061 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.241950 2560 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-metrics-client-ca\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.242596 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.242569 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:57:08.242693 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.242650 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:57:08.242873 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.242844 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:57:08.244425 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.244394 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:57:08.245204 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245072 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out" (OuterVolumeSpecName: "config-out") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:57:08.245204 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245098 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.245204 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245104 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.245204 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245158 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.245453 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245423 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7" (OuterVolumeSpecName: "kube-api-access-jdpj7") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "kube-api-access-jdpj7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:57:08.245604 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245584 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:57:08.245780 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245756 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.245886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.245823 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config" (OuterVolumeSpecName: "config") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.246418 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.246400 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.246600 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.246583 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.246662 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.246611 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.255470 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.255447 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config" (OuterVolumeSpecName: "web-config") pod "131e7fcc-a2a6-4516-bb92-7f51371d0ebc" (UID: "131e7fcc-a2a6-4516-bb92-7f51371d0ebc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:57:08.342778 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342746 2560 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.342778 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342778 2560 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342792 2560 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-tls-assets\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342806 2560 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-grpc-tls\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342818 2560 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342833 2560 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342845 2560 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-web-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342857 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdpj7\" (UniqueName: \"kubernetes.io/projected/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-kube-api-access-jdpj7\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342869 2560 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config-out\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342881 2560 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-kube-rbac-proxy\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342893 2560 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-config\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342907 2560 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342920 2560 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-metrics-client-certs\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342934 2560 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342949 2560 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-db\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.343007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.342963 2560 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/131e7fcc-a2a6-4516-bb92-7f51371d0ebc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 19:57:08.937217 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937182 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" exitCode=0
Apr 16 19:57:08.937217 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937211 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" exitCode=0
Apr 16 19:57:08.937217 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937219 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf" exitCode=0
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937228 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491" exitCode=0
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937235 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd" exitCode=0
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937242 2560 generic.go:358] "Generic (PLEG): container finished" podID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" exitCode=0
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937265 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937313 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937327 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937337 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937346 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937355 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937366 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"131e7fcc-a2a6-4516-bb92-7f51371d0ebc","Type":"ContainerDied","Data":"9a4336224e8313439c1ef7c4d3a9f0152b6cb49a802f3e1df5a0519c63918459"}
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937312 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:08.937676 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.937326 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"
Apr 16 19:57:08.944818 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.944800 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"
Apr 16 19:57:08.951843 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.951826 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"
Apr 16 19:57:08.957813 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.957796 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"
Apr 16 19:57:08.961588 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.961567 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:57:08.964790 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.964770 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:57:08.966134 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.966102 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"
Apr 16 19:57:08.972488 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.972473 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"
Apr 16 19:57:08.978879 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.978865 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"
Apr 16 19:57:08.985051 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985037 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"
Apr 16 19:57:08.985297 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.985279 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"
Apr 16 19:57:08.985353 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985306 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist"
Apr 16 19:57:08.985353 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985324 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"
Apr 16 19:57:08.985572 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.985554 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"
Apr 16 19:57:08.985636 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985594 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist"
Apr 16 19:57:08.985636 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985619 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"
Apr 16 19:57:08.985853 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.985836 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"
Apr 16 19:57:08.985892 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985860 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist"
Apr 16 19:57:08.985892 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.985874 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"
Apr 16 19:57:08.986088 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.986070 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"
Apr 16 19:57:08.986182 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986096 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist"
Apr 16 19:57:08.986182 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986129 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"
Apr 16 19:57:08.986360 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.986344 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"
Apr 16 19:57:08.986397 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986364 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to
get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist" Apr 16 19:57:08.986397 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986377 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" Apr 16 19:57:08.986550 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.986534 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" Apr 16 19:57:08.986613 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986555 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist" Apr 16 19:57:08.986613 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986575 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" Apr 16 19:57:08.986766 ip-10-0-131-77 kubenswrapper[2560]: E0416 19:57:08.986752 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" Apr 16 19:57:08.986805 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986769 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist" Apr 16 19:57:08.986805 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986780 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" Apr 16 19:57:08.986987 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986968 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist" Apr 16 19:57:08.987031 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.986988 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" Apr 16 19:57:08.987205 ip-10-0-131-77 kubenswrapper[2560]: I0416 
19:57:08.987187 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist" Apr 16 19:57:08.987245 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987207 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf" Apr 16 19:57:08.987429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987408 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist" Apr 16 19:57:08.987429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987427 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491" Apr 16 19:57:08.987637 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987620 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container 
\"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist" Apr 16 19:57:08.987701 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987639 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd" Apr 16 19:57:08.987865 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987850 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist" Apr 16 19:57:08.987918 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.987865 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" Apr 16 19:57:08.988053 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988034 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist" Apr 16 19:57:08.988145 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988055 2560 scope.go:117] "RemoveContainer" 
containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" Apr 16 19:57:08.988289 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988269 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist" Apr 16 19:57:08.988362 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988291 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" Apr 16 19:57:08.988713 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988608 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist" Apr 16 19:57:08.988713 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.988636 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" Apr 16 19:57:08.989195 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.989098 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status 
\"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist" Apr 16 19:57:08.989195 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.989139 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf" Apr 16 19:57:08.989602 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.989520 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist" Apr 16 19:57:08.989602 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.989543 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491" Apr 16 19:57:08.989929 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.989902 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist" Apr 16 19:57:08.989929 ip-10-0-131-77 
kubenswrapper[2560]: I0416 19:57:08.989929 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd" Apr 16 19:57:08.990283 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990242 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist" Apr 16 19:57:08.990283 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990271 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" Apr 16 19:57:08.990526 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990498 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist" Apr 16 19:57:08.990575 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990527 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" Apr 16 19:57:08.990763 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990743 2560 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist" Apr 16 19:57:08.990834 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990765 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" Apr 16 19:57:08.990880 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990839 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:57:08.991007 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.990990 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist" Apr 16 19:57:08.991074 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991008 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" Apr 16 19:57:08.991266 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991234 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="config-reloader" Apr 16 19:57:08.991266 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991258 2560 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="config-reloader" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991270 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-web" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991279 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-web" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991305 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="thanos-sidecar" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991313 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="thanos-sidecar" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991325 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991334 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991345 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="init-config-reloader" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991352 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="init-config-reloader" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991368 2560 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="prometheus" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991365 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991386 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf" Apr 16 19:57:08.991429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991376 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="prometheus" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991446 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-thanos" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991452 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-thanos" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991569 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="thanos-sidecar" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991603 2560 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-web" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991616 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy-thanos" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991631 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="prometheus" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991647 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="config-reloader" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991657 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" containerName="kube-rbac-proxy" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991676 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist" Apr 16 19:57:08.991872 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991696 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491" Apr 16 19:57:08.992272 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991932 2560 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist" Apr 16 19:57:08.992272 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.991952 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd" Apr 16 19:57:08.992272 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.992228 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist" Apr 16 19:57:08.992272 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.992249 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29" Apr 16 19:57:08.992505 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.992482 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 
8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist" Apr 16 19:57:08.992584 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.992505 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596" Apr 16 19:57:08.993258 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993165 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist" Apr 16 19:57:08.993258 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993188 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e" Apr 16 19:57:08.993517 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993436 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist" Apr 16 19:57:08.993517 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993459 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8" Apr 16 19:57:08.993802 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993718 2560 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist"
Apr 16 19:57:08.993802 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993741 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"
Apr 16 19:57:08.994066 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.993989 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist"
Apr 16 19:57:08.994066 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994011 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"
Apr 16 19:57:08.994357 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994273 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist"
Apr 16 19:57:08.994357 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994299 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"
Apr 16 19:57:08.994609 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994530 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist"
Apr 16 19:57:08.994609 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994552 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"
Apr 16 19:57:08.994819 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994794 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist"
Apr 16 19:57:08.994819 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.994820 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"
Apr 16 19:57:08.995056 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995035 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist"
Apr 16 19:57:08.995153 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995058 2560 scope.go:117] "RemoveContainer" containerID="b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"
Apr 16 19:57:08.995383 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995361 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e"} err="failed to get container status \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": rpc error: code = NotFound desc = could not find container \"b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e\": container with ID starting with b40f651ad69eed31dbc3d555ba864bae49ed1a799702aef3d6dcf07046514b1e not found: ID does not exist"
Apr 16 19:57:08.995440 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995385 2560 scope.go:117] "RemoveContainer" containerID="622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"
Apr 16 19:57:08.995631 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995611 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8"} err="failed to get container status \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": rpc error: code = NotFound desc = could not find container \"622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8\": container with ID starting with 622975d18eebafbdb1467e2f4e493cb6eac8f9824cb63bf441daf7a3d4fa60f8 not found: ID does not exist"
Apr 16 19:57:08.995688 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995631 2560 scope.go:117] "RemoveContainer" containerID="2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"
Apr 16 19:57:08.995860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995822 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf"} err="failed to get container status \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": rpc error: code = NotFound desc = could not find container \"2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf\": container with ID starting with 2944433fda1073e9782e6c2e3f26d31b204c62dfc3686a7de555b2d99a7044cf not found: ID does not exist"
Apr 16 19:57:08.995860 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.995839 2560 scope.go:117] "RemoveContainer" containerID="073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"
Apr 16 19:57:08.996051 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996030 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491"} err="failed to get container status \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": rpc error: code = NotFound desc = could not find container \"073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491\": container with ID starting with 073bb57dbcdf4bb23f957fea1e56368017610983ae5e5671ff5191b066b0e491 not found: ID does not exist"
Apr 16 19:57:08.996130 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996054 2560 scope.go:117] "RemoveContainer" containerID="5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"
Apr 16 19:57:08.996310 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996292 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd"} err="failed to get container status \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": rpc error: code = NotFound desc = could not find container \"5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd\": container with ID starting with 5ff7ebd428fda306f98a8caf5cd226964978e63730dbcfd01d93004e76a09ebd not found: ID does not exist"
Apr 16 19:57:08.996376 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996311 2560 scope.go:117] "RemoveContainer" containerID="8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"
Apr 16 19:57:08.996538 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996517 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29"} err="failed to get container status \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": rpc error: code = NotFound desc = could not find container \"8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29\": container with ID starting with 8430f6909000ff4b9982759d9e0fdd1bf4c0b6b4f280116d2f1ff44e1784fd29 not found: ID does not exist"
Apr 16 19:57:08.996538 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996537 2560 scope.go:117] "RemoveContainer" containerID="12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"
Apr 16 19:57:08.996849 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.996825 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596"} err="failed to get container status \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": rpc error: code = NotFound desc = could not find container \"12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596\": container with ID starting with 12ec6549891f4179699d1ec9aaf477b9647898b2de5b20ac8f7e1e0e681e6596 not found: ID does not exist"
Apr 16 19:57:08.997960 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:08.997944 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.000623 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.000602 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:57:09.000723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.000656 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:57:09.000723 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.000680 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:57:09.000942 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.000925 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:57:09.000942 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.000934 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:57:09.001198 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001183 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:57:09.001313 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001299 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:57:09.001393 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001378 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-b61u7l559ts5g\""
Apr 16 19:57:09.001488 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001474 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:57:09.001610 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001598 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:57:09.001665 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001645 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:57:09.001890 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001875 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:57:09.001934 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.001882 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sbnfd\""
Apr 16 19:57:09.004663 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.004645 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:57:09.013612 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.010518 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:57:09.013612 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.012568 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:57:09.048391 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048362 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048391 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048391 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config-out\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048601 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048410 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mplm5\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-kube-api-access-mplm5\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048601 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048471 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048601 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048574 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048744 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048626 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048744 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048648 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048744 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048744 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048678 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048744 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048705 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048969 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048803 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048969 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048885 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.048969 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048930 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-web-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.049087 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.048985 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.049087 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.049026 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.049087 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.049073 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.049225 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.049150 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.049259 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.049240 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.149932 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.149884 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.149932 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.149943 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.149962 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config-out\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.149979 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mplm5\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-kube-api-access-mplm5\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.149999 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150015 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150040 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150068 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150092 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150139 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150167 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150191 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150191 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150232 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150262 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-web-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150286 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150306 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150339 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150648 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150368 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150972 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150716 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.150972 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.150834 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.151072 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.151001 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153350 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153353 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153587 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153699 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153355 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.153913 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153891 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.154323 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.153988 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.154323 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.154023 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.154429 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.154344 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.154618 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.154595 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-config-out\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.155484 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.155458 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-web-config\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.155735 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.155716 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.156006 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.155987 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.156526 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.156505 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.159187 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.159169 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mplm5\" (UniqueName: \"kubernetes.io/projected/eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf-kube-api-access-mplm5\") pod \"prometheus-k8s-0\" (UID: \"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.311709 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.311638 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:57:09.432462 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.432437 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:57:09.434966 ip-10-0-131-77 kubenswrapper[2560]: W0416 19:57:09.434939 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb3e1c48_92d5_41bb_92b4_7c99a14cb4bf.slice/crio-c7ede8553fc07c881f3a10cf17da3a95fdee4d00998b70ec9c556dd4170964e2 WatchSource:0}: Error finding container c7ede8553fc07c881f3a10cf17da3a95fdee4d00998b70ec9c556dd4170964e2: Status 404 returned error can't find the container with id c7ede8553fc07c881f3a10cf17da3a95fdee4d00998b70ec9c556dd4170964e2
Apr 16 19:57:09.941288 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.941255 2560 generic.go:358] "Generic (PLEG): container finished" podID="eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf" containerID="0263f30893fe6c946ce69462ab3d0a9a2eb7662efc409cd9f2f095c096e72a87" exitCode=0
Apr 16 19:57:09.941703 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.941333 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerDied","Data":"0263f30893fe6c946ce69462ab3d0a9a2eb7662efc409cd9f2f095c096e72a87"}
Apr 16 19:57:09.941703 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:09.941367 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"c7ede8553fc07c881f3a10cf17da3a95fdee4d00998b70ec9c556dd4170964e2"}
Apr 16 19:57:10.225579 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.225431 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131e7fcc-a2a6-4516-bb92-7f51371d0ebc" path="/var/lib/kubelet/pods/131e7fcc-a2a6-4516-bb92-7f51371d0ebc/volumes"
Apr 16 19:57:10.947666 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.947631 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"780a7a5502217c63531e0e25f315449abdde0d6066c0ac0daf7daaf469e49ecc"}
Apr 16 19:57:10.948081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.947667 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"a07d6326c6de7bfff1b8366b99a2f9c16271445d1f441330e947b85d1c817bc9"}
Apr 16 19:57:10.948081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.948011 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"a34fd44e9e41df479b7724ce73363a5d9e05bb97a729fd563bcebdc3c84b44c4"}
Apr 16 19:57:10.948081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.948040 2560 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"6aea373d07b64f977d14fd92de5494839120c6560756de5a766a99d4307db65e"} Apr 16 19:57:10.948081 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.948069 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"0ecf4344635809c084902a6f608b861bf5e297a06a324ae2d0f78b62d7e5f8c8"} Apr 16 19:57:10.948314 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.948084 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf","Type":"ContainerStarted","Data":"db4f39c7934a9def66691fd1fbc24566619727e5d1ff69e9d4d49ada70b1e57b"} Apr 16 19:57:10.977886 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:10.977828 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.977810662 podStartE2EDuration="2.977810662s" podCreationTimestamp="2026-04-16 19:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:57:10.976817077 +0000 UTC m=+197.310198421" watchObservedRunningTime="2026-04-16 19:57:10.977810662 +0000 UTC m=+197.311191985" Apr 16 19:57:14.312665 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:57:14.312631 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:58:09.312516 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:58:09.312475 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:58:09.327718 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:58:09.327688 2560 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:58:10.129590 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:58:10.129562 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:58:54.117750 ip-10-0-131-77 kubenswrapper[2560]: I0416 19:58:54.117723 2560 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:00:01.307703 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.307623 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gvx4k"] Apr 16 20:00:01.310743 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.310726 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.313736 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.313702 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 20:00:01.313736 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.313705 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 20:00:01.313907 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.313748 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-s6clc\"" Apr 16 20:00:01.320478 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.320456 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gvx4k"] Apr 16 20:00:01.466387 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.466346 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wxz\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-kube-api-access-v8wxz\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" 
(UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.466562 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.466397 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" (UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.567519 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.567435 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wxz\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-kube-api-access-v8wxz\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" (UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.567519 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.567493 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" (UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.575810 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.575781 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" (UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.575973 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.575901 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v8wxz\" (UniqueName: \"kubernetes.io/projected/6bde1ed2-23c1-42d8-a6e4-41b91a5f7999-kube-api-access-v8wxz\") pod \"cert-manager-webhook-597b96b99b-gvx4k\" (UID: \"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.628474 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.628439 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:00:01.711870 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.711832 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9d4hw"] Apr 16 20:00:01.716674 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.716651 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.720082 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.719998 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-x9f7n\"" Apr 16 20:00:01.732492 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.730991 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9d4hw"] Apr 16 20:00:01.756430 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.756406 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gvx4k"] Apr 16 20:00:01.758894 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:00:01.758865 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bde1ed2_23c1_42d8_a6e4_41b91a5f7999.slice/crio-a899b043c8d547786e896e058749b825d2f56d3ffd2af06f50e3f690cf9f022f WatchSource:0}: Error finding container a899b043c8d547786e896e058749b825d2f56d3ffd2af06f50e3f690cf9f022f: Status 404 returned error can't find the container 
with id a899b043c8d547786e896e058749b825d2f56d3ffd2af06f50e3f690cf9f022f Apr 16 20:00:01.760696 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.760678 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:00:01.871236 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.871138 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkp5d\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-kube-api-access-tkp5d\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.871388 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.871248 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.972455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.972416 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.972614 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.972499 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkp5d\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-kube-api-access-tkp5d\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.982649 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.982616 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:01.982870 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:01.982850 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkp5d\" (UniqueName: \"kubernetes.io/projected/a38e7c01-acc2-4608-9c5c-65cfbdc35497-kube-api-access-tkp5d\") pod \"cert-manager-cainjector-8966b78d4-9d4hw\" (UID: \"a38e7c01-acc2-4608-9c5c-65cfbdc35497\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:02.035053 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:02.035016 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" Apr 16 20:00:02.173028 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:02.172999 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9d4hw"] Apr 16 20:00:02.174685 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:00:02.174662 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda38e7c01_acc2_4608_9c5c_65cfbdc35497.slice/crio-c2d8941b6a641c571bcc3f7aa3c8671d0fe24f277d89c708b13eafde26f7fa37 WatchSource:0}: Error finding container c2d8941b6a641c571bcc3f7aa3c8671d0fe24f277d89c708b13eafde26f7fa37: Status 404 returned error can't find the container with id c2d8941b6a641c571bcc3f7aa3c8671d0fe24f277d89c708b13eafde26f7fa37 Apr 16 20:00:02.426351 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:02.426269 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" event={"ID":"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999","Type":"ContainerStarted","Data":"a899b043c8d547786e896e058749b825d2f56d3ffd2af06f50e3f690cf9f022f"} Apr 16 20:00:02.427310 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:02.427286 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" event={"ID":"a38e7c01-acc2-4608-9c5c-65cfbdc35497","Type":"ContainerStarted","Data":"c2d8941b6a641c571bcc3f7aa3c8671d0fe24f277d89c708b13eafde26f7fa37"} Apr 16 20:00:08.445786 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.445748 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" event={"ID":"a38e7c01-acc2-4608-9c5c-65cfbdc35497","Type":"ContainerStarted","Data":"3e1287c90b5d5136632aeab1a5727dc4ba1b6b34eaca4f144a1d7516fc115e3d"} Apr 16 20:00:08.463353 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.463303 2560 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-9d4hw" podStartSLOduration=1.41165651 podStartE2EDuration="7.463290422s" podCreationTimestamp="2026-04-16 20:00:01 +0000 UTC" firstStartedPulling="2026-04-16 20:00:02.176510509 +0000 UTC m=+368.509891824" lastFinishedPulling="2026-04-16 20:00:08.228144437 +0000 UTC m=+374.561525736" observedRunningTime="2026-04-16 20:00:08.461659727 +0000 UTC m=+374.795041048" watchObservedRunningTime="2026-04-16 20:00:08.463290422 +0000 UTC m=+374.796671743" Apr 16 20:00:08.897069 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.897036 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf"] Apr 16 20:00:08.899312 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.899296 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:08.902006 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.901967 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:00:08.902006 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.901969 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-v8djf\"" Apr 16 20:00:08.902196 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.902013 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 20:00:08.907484 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:08.907463 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf"] Apr 16 20:00:09.036685 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.036653 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-tmp\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.036685 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.036689 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdbr\" (UniqueName: \"kubernetes.io/projected/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-kube-api-access-twdbr\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.138190 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.138158 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-tmp\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.138319 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.138197 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twdbr\" (UniqueName: \"kubernetes.io/projected/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-kube-api-access-twdbr\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.138609 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.138590 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-tmp\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.148139 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.148063 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twdbr\" (UniqueName: \"kubernetes.io/projected/ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a-kube-api-access-twdbr\") pod \"openshift-lws-operator-bfc7f696d-xnmhf\" (UID: \"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.209065 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.209038 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" Apr 16 20:00:09.328564 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.328496 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf"] Apr 16 20:00:09.330603 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:00:09.330561 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7d0703_b6b2_4c92_af9f_e3cb9bf9700a.slice/crio-d823baa8838ca3252d7e9ed6f14fab257ebdbf698c01ef362b134aa73ac774da WatchSource:0}: Error finding container d823baa8838ca3252d7e9ed6f14fab257ebdbf698c01ef362b134aa73ac774da: Status 404 returned error can't find the container with id d823baa8838ca3252d7e9ed6f14fab257ebdbf698c01ef362b134aa73ac774da Apr 16 20:00:09.450049 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:09.450019 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" event={"ID":"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a","Type":"ContainerStarted","Data":"d823baa8838ca3252d7e9ed6f14fab257ebdbf698c01ef362b134aa73ac774da"} Apr 16 20:00:12.460023 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:12.459985 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" event={"ID":"ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a","Type":"ContainerStarted","Data":"cb98e3517cb19b5d5ee1145fcc09dfbcab185722253c6eb757207dc47687e5f6"} Apr 16 20:00:12.479824 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:00:12.479782 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xnmhf" podStartSLOduration=1.986389415 podStartE2EDuration="4.479769157s" podCreationTimestamp="2026-04-16 20:00:08 +0000 UTC" firstStartedPulling="2026-04-16 20:00:09.331957562 +0000 UTC m=+375.665338861" lastFinishedPulling="2026-04-16 20:00:11.825337294 +0000 UTC m=+378.158718603" observedRunningTime="2026-04-16 20:00:12.47769784 +0000 UTC m=+378.811079161" watchObservedRunningTime="2026-04-16 20:00:12.479769157 +0000 UTC m=+378.813150478" Apr 16 20:01:02.179639 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:01:02.179540 2560 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671: reading manifest sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671 in registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671" Apr 16 20:01:02.180038 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:01:02.179774 2560 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:cert-manager-webhook,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671,Command:[/app/cmd/webhook/webhook],Args:[--dynamic-serving-ca-secret-name=cert-manager-webhook-ca --dynamic-serving-ca-secret-namespace=$(POD_NAMESPACE) --dynamic-serving-dns-names=cert-manager-webhook,cert-manager-webhook.$(POD_NAMESPACE),cert-manager-webhook.$(POD_NAMESPACE).svc --secure-port=10250 --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:10250,Protocol:TCP,HostIP:,},ContainerPort{Name:healthcheck,HostPort:0,ContainerPort:6080,Protocol:TCP,HostIP:,},ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8wxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{1 0 healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 
healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-webhook-597b96b99b-gvx4k_cert-manager(6bde1ed2-23c1-42d8-a6e4-41b91a5f7999): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671: reading manifest sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671 in registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 16 20:01:02.180956 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:01:02.180925 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671: reading manifest sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671 in 
registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" podUID="6bde1ed2-23c1-42d8-a6e4-41b91a5f7999" Apr 16 20:01:03.610460 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:03.610419 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" event={"ID":"6bde1ed2-23c1-42d8-a6e4-41b91a5f7999","Type":"ContainerStarted","Data":"671b713fc24c6bacf448d81b926b7a7d281da14cb012ff46220114dbf995e5bc"} Apr 16 20:01:03.610935 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:03.610615 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:01:03.629122 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:03.629066 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" podStartSLOduration=-9223371974.225723 podStartE2EDuration="1m2.629052314s" podCreationTimestamp="2026-04-16 20:00:01 +0000 UTC" firstStartedPulling="2026-04-16 20:00:01.760833924 +0000 UTC m=+368.094215223" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:03.62761139 +0000 UTC m=+429.960992703" watchObservedRunningTime="2026-04-16 20:01:03.629052314 +0000 UTC m=+429.962433634" Apr 16 20:01:09.616601 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:09.616567 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gvx4k" Apr 16 20:01:39.293318 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.293281 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb"] Apr 16 20:01:39.296563 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.296546 2560 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.301631 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.301611 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 20:01:39.301631 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.301622 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bs2lj\"" Apr 16 20:01:39.301631 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.301622 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 20:01:39.301883 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.301669 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 20:01:39.314350 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.314329 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb"] Apr 16 20:01:39.401117 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.401074 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnpn\" (UniqueName: \"kubernetes.io/projected/bacd7afe-b6b7-4f06-8ece-63857ab75f80-kube-api-access-qjnpn\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.401277 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.401149 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bacd7afe-b6b7-4f06-8ece-63857ab75f80-manager-config\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" 
(UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.401277 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.401198 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.401277 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.401232 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.502413 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.502380 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.502549 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.502428 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnpn\" (UniqueName: \"kubernetes.io/projected/bacd7afe-b6b7-4f06-8ece-63857ab75f80-kube-api-access-qjnpn\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.502549 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.502458 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bacd7afe-b6b7-4f06-8ece-63857ab75f80-manager-config\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.502549 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.502496 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.503205 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.503173 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bacd7afe-b6b7-4f06-8ece-63857ab75f80-manager-config\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.504849 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.504827 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.504957 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.504850 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bacd7afe-b6b7-4f06-8ece-63857ab75f80-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " 
pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.511036 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.511010 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnpn\" (UniqueName: \"kubernetes.io/projected/bacd7afe-b6b7-4f06-8ece-63857ab75f80-kube-api-access-qjnpn\") pod \"lws-controller-manager-ddc57ffc5-n74kb\" (UID: \"bacd7afe-b6b7-4f06-8ece-63857ab75f80\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.606154 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.606046 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:39.740074 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:39.740044 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb"] Apr 16 20:01:39.743262 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:01:39.743231 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbacd7afe_b6b7_4f06_8ece_63857ab75f80.slice/crio-e53f9bf7cee572a7c00e957088cca427d37a4597482971f19608f1b34f7008ac WatchSource:0}: Error finding container e53f9bf7cee572a7c00e957088cca427d37a4597482971f19608f1b34f7008ac: Status 404 returned error can't find the container with id e53f9bf7cee572a7c00e957088cca427d37a4597482971f19608f1b34f7008ac Apr 16 20:01:40.728845 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:40.728805 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" event={"ID":"bacd7afe-b6b7-4f06-8ece-63857ab75f80","Type":"ContainerStarted","Data":"e53f9bf7cee572a7c00e957088cca427d37a4597482971f19608f1b34f7008ac"} Apr 16 20:01:41.733525 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:41.733487 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" event={"ID":"bacd7afe-b6b7-4f06-8ece-63857ab75f80","Type":"ContainerStarted","Data":"3942fa37935b6b5d624cbcdd49fa987664a209a9150ddbffac6ad63889b729f3"} Apr 16 20:01:41.733988 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:41.733603 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:41.765553 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:41.765486 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" podStartSLOduration=1.539671615 podStartE2EDuration="2.765471106s" podCreationTimestamp="2026-04-16 20:01:39 +0000 UTC" firstStartedPulling="2026-04-16 20:01:39.745524994 +0000 UTC m=+466.078906307" lastFinishedPulling="2026-04-16 20:01:40.971324494 +0000 UTC m=+467.304705798" observedRunningTime="2026-04-16 20:01:41.763093486 +0000 UTC m=+468.096474851" watchObservedRunningTime="2026-04-16 20:01:41.765471106 +0000 UTC m=+468.098852427" Apr 16 20:01:46.382361 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.382327 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj"] Apr 16 20:01:46.385586 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.385571 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:46.388358 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.388338 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:01:46.388478 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.388443 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:01:46.388478 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.388443 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 20:01:46.389358 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.389344 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-j5n6w\"" Apr 16 20:01:46.396068 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.396048 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj"] Apr 16 20:01:46.462453 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.462426 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv45j\" (UniqueName: \"kubernetes.io/projected/343e342c-4b0c-47e4-8d65-800f3ca262b7-kube-api-access-hv45j\") pod \"dns-operator-controller-manager-844548ff4c-rdkhj\" (UID: \"343e342c-4b0c-47e4-8d65-800f3ca262b7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:46.471206 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.471178 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n"] Apr 16 20:01:46.474431 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.474411 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.477034 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.477014 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 20:01:46.477034 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.477031 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 20:01:46.477198 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.477035 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qrnwl\"" Apr 16 20:01:46.484907 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.484887 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n"] Apr 16 20:01:46.563564 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.563533 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.563711 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.563569 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nn7j\" (UniqueName: \"kubernetes.io/projected/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-kube-api-access-9nn7j\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.563711 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.563634 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.563711 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.563660 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv45j\" (UniqueName: \"kubernetes.io/projected/343e342c-4b0c-47e4-8d65-800f3ca262b7-kube-api-access-hv45j\") pod \"dns-operator-controller-manager-844548ff4c-rdkhj\" (UID: \"343e342c-4b0c-47e4-8d65-800f3ca262b7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:46.572611 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.572587 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv45j\" (UniqueName: \"kubernetes.io/projected/343e342c-4b0c-47e4-8d65-800f3ca262b7-kube-api-access-hv45j\") pod \"dns-operator-controller-manager-844548ff4c-rdkhj\" (UID: \"343e342c-4b0c-47e4-8d65-800f3ca262b7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:46.664046 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.664019 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.664190 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.664054 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nn7j\" (UniqueName: \"kubernetes.io/projected/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-kube-api-access-9nn7j\") pod 
\"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.664243 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.664192 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.664743 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.664725 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.666373 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.666350 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.672053 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.672029 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nn7j\" (UniqueName: \"kubernetes.io/projected/6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76-kube-api-access-9nn7j\") pod \"kuadrant-console-plugin-6c886788f8-xq84n\" (UID: \"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.696947 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.696913 2560 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:46.783719 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.783687 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" Apr 16 20:01:46.843564 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.843490 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj"] Apr 16 20:01:46.846261 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:01:46.846224 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343e342c_4b0c_47e4_8d65_800f3ca262b7.slice/crio-0196fdc6eed638f0a10784e1ed433822d29d440439c1d34aa51663ad9b4bcd14 WatchSource:0}: Error finding container 0196fdc6eed638f0a10784e1ed433822d29d440439c1d34aa51663ad9b4bcd14: Status 404 returned error can't find the container with id 0196fdc6eed638f0a10784e1ed433822d29d440439c1d34aa51663ad9b4bcd14 Apr 16 20:01:46.916663 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:46.916521 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n"] Apr 16 20:01:47.755870 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:47.755831 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" event={"ID":"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76","Type":"ContainerStarted","Data":"07ef1019eab697642e4b8cd800ede38df6fa1561d90d248ae53a03b342a08445"} Apr 16 20:01:47.757545 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:47.757515 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" 
event={"ID":"343e342c-4b0c-47e4-8d65-800f3ca262b7","Type":"ContainerStarted","Data":"0196fdc6eed638f0a10784e1ed433822d29d440439c1d34aa51663ad9b4bcd14"} Apr 16 20:01:52.739860 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.739832 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-n74kb" Apr 16 20:01:52.778899 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.778863 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" event={"ID":"343e342c-4b0c-47e4-8d65-800f3ca262b7","Type":"ContainerStarted","Data":"09ebe3d2965c4f8c57a99b309e1d9d00caf3eb3e5acea085e6a480aef60008c8"} Apr 16 20:01:52.778899 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.778909 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:01:52.780285 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.780253 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" event={"ID":"6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76","Type":"ContainerStarted","Data":"f90ae9754bbc9938ff51a43f069b022b8370456791edbb79641792f87a14b032"} Apr 16 20:01:52.798899 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.798846 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" podStartSLOduration=1.5794820569999999 podStartE2EDuration="6.798832577s" podCreationTimestamp="2026-04-16 20:01:46 +0000 UTC" firstStartedPulling="2026-04-16 20:01:46.848443622 +0000 UTC m=+473.181824925" lastFinishedPulling="2026-04-16 20:01:52.067794144 +0000 UTC m=+478.401175445" observedRunningTime="2026-04-16 20:01:52.797945489 +0000 UTC m=+479.131326811" watchObservedRunningTime="2026-04-16 20:01:52.798832577 +0000 UTC m=+479.132213897" Apr 16 
20:01:52.814327 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:01:52.814285 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-xq84n" podStartSLOduration=1.595838461 podStartE2EDuration="6.814270908s" podCreationTimestamp="2026-04-16 20:01:46 +0000 UTC" firstStartedPulling="2026-04-16 20:01:46.925238743 +0000 UTC m=+473.258620042" lastFinishedPulling="2026-04-16 20:01:52.143671187 +0000 UTC m=+478.477052489" observedRunningTime="2026-04-16 20:01:52.812567012 +0000 UTC m=+479.145948333" watchObservedRunningTime="2026-04-16 20:01:52.814270908 +0000 UTC m=+479.147652229" Apr 16 20:02:03.785815 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:03.785787 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rdkhj" Apr 16 20:02:27.165811 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.165765 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"] Apr 16 20:02:27.191963 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.191930 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"] Apr 16 20:02:27.192128 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.192058 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.195856 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.195835 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 20:02:27.261209 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.261176 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"] Apr 16 20:02:27.283147 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.283101 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvx7\" (UniqueName: \"kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.283317 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.283187 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.384516 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.384485 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvx7\" (UniqueName: \"kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.384663 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.384532 2560 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.385190 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.385173 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.393192 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.393170 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvx7\" (UniqueName: \"kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7\") pod \"limitador-limitador-64c8f475fb-t4l54\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.502236 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.502159 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:27.622625 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.622592 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"] Apr 16 20:02:27.626331 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:02:27.626293 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25186d54_ca56_4257_ae36_93c55f98e737.slice/crio-2c592588f99780ca7304f7adf885b7a998555522631d71741829b4abce987c7c WatchSource:0}: Error finding container 2c592588f99780ca7304f7adf885b7a998555522631d71741829b4abce987c7c: Status 404 returned error can't find the container with id 2c592588f99780ca7304f7adf885b7a998555522631d71741829b4abce987c7c Apr 16 20:02:27.897226 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.897142 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" event={"ID":"25186d54-ca56-4257-ae36-93c55f98e737","Type":"ContainerStarted","Data":"2c592588f99780ca7304f7adf885b7a998555522631d71741829b4abce987c7c"} Apr 16 20:02:27.982146 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.982099 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"] Apr 16 20:02:27.986619 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.986604 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-cgdnl" Apr 16 20:02:27.989497 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.989357 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-7xs7p\"" Apr 16 20:02:27.994340 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:27.994313 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"] Apr 16 20:02:28.089857 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.089822 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkc9\" (UniqueName: \"kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9\") pod \"authorino-674b59b84c-cgdnl\" (UID: \"30b55c50-f06b-4a51-8de7-c24f91d4d4be\") " pod="kuadrant-system/authorino-674b59b84c-cgdnl" Apr 16 20:02:28.163192 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.163160 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"] Apr 16 20:02:28.166451 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.166434 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8vc59" Apr 16 20:02:28.170484 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.170461 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"] Apr 16 20:02:28.190397 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.190365 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkc9\" (UniqueName: \"kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9\") pod \"authorino-674b59b84c-cgdnl\" (UID: \"30b55c50-f06b-4a51-8de7-c24f91d4d4be\") " pod="kuadrant-system/authorino-674b59b84c-cgdnl" Apr 16 20:02:28.190532 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.190427 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkxkj\" (UniqueName: \"kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj\") pod \"authorino-79cbc94b89-8vc59\" (UID: \"4b900d2b-9062-4c36-8c87-ab82f048b6d6\") " pod="kuadrant-system/authorino-79cbc94b89-8vc59" Apr 16 20:02:28.198614 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.198588 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkc9\" (UniqueName: \"kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9\") pod \"authorino-674b59b84c-cgdnl\" (UID: \"30b55c50-f06b-4a51-8de7-c24f91d4d4be\") " pod="kuadrant-system/authorino-674b59b84c-cgdnl" Apr 16 20:02:28.290980 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.290949 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkxkj\" (UniqueName: \"kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj\") pod \"authorino-79cbc94b89-8vc59\" (UID: \"4b900d2b-9062-4c36-8c87-ab82f048b6d6\") " pod="kuadrant-system/authorino-79cbc94b89-8vc59" Apr 16 20:02:28.297236 
ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.297207 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-cgdnl" Apr 16 20:02:28.299992 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.299970 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkxkj\" (UniqueName: \"kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj\") pod \"authorino-79cbc94b89-8vc59\" (UID: \"4b900d2b-9062-4c36-8c87-ab82f048b6d6\") " pod="kuadrant-system/authorino-79cbc94b89-8vc59" Apr 16 20:02:28.431200 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.431098 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"] Apr 16 20:02:28.433698 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:02:28.433667 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b55c50_f06b_4a51_8de7_c24f91d4d4be.slice/crio-b69634d7d8b0d1ad126a788e4823a1c849502fff74518a4db107527fadb82c77 WatchSource:0}: Error finding container b69634d7d8b0d1ad126a788e4823a1c849502fff74518a4db107527fadb82c77: Status 404 returned error can't find the container with id b69634d7d8b0d1ad126a788e4823a1c849502fff74518a4db107527fadb82c77 Apr 16 20:02:28.476002 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.475976 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8vc59" Apr 16 20:02:28.628933 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.628821 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"] Apr 16 20:02:28.752618 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:02:28.752553 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b900d2b_9062_4c36_8c87_ab82f048b6d6.slice/crio-d0ba024271ec551ce255535140eca6aad0c63970757efc31eebc5f89f6145e5c WatchSource:0}: Error finding container d0ba024271ec551ce255535140eca6aad0c63970757efc31eebc5f89f6145e5c: Status 404 returned error can't find the container with id d0ba024271ec551ce255535140eca6aad0c63970757efc31eebc5f89f6145e5c Apr 16 20:02:28.901146 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.901090 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" event={"ID":"25186d54-ca56-4257-ae36-93c55f98e737","Type":"ContainerStarted","Data":"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"} Apr 16 20:02:28.901330 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.901313 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" Apr 16 20:02:28.902150 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.902129 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8vc59" event={"ID":"4b900d2b-9062-4c36-8c87-ab82f048b6d6","Type":"ContainerStarted","Data":"d0ba024271ec551ce255535140eca6aad0c63970757efc31eebc5f89f6145e5c"} Apr 16 20:02:28.903076 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.903053 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-cgdnl" 
event={"ID":"30b55c50-f06b-4a51-8de7-c24f91d4d4be","Type":"ContainerStarted","Data":"b69634d7d8b0d1ad126a788e4823a1c849502fff74518a4db107527fadb82c77"}
Apr 16 20:02:28.918052 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:28.918012 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" podStartSLOduration=0.736781305 podStartE2EDuration="1.91799831s" podCreationTimestamp="2026-04-16 20:02:27 +0000 UTC" firstStartedPulling="2026-04-16 20:02:27.628178138 +0000 UTC m=+513.961559437" lastFinishedPulling="2026-04-16 20:02:28.80939514 +0000 UTC m=+515.142776442" observedRunningTime="2026-04-16 20:02:28.916662469 +0000 UTC m=+515.250043790" watchObservedRunningTime="2026-04-16 20:02:28.91799831 +0000 UTC m=+515.251379631"
Apr 16 20:02:31.918645 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:31.918601 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8vc59" event={"ID":"4b900d2b-9062-4c36-8c87-ab82f048b6d6","Type":"ContainerStarted","Data":"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"}
Apr 16 20:02:31.919871 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:31.919844 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-cgdnl" event={"ID":"30b55c50-f06b-4a51-8de7-c24f91d4d4be","Type":"ContainerStarted","Data":"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"}
Apr 16 20:02:31.935036 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:31.934995 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-8vc59" podStartSLOduration=1.620751101 podStartE2EDuration="3.934982345s" podCreationTimestamp="2026-04-16 20:02:28 +0000 UTC" firstStartedPulling="2026-04-16 20:02:28.754036534 +0000 UTC m=+515.087417838" lastFinishedPulling="2026-04-16 20:02:31.068267763 +0000 UTC m=+517.401649082" observedRunningTime="2026-04-16 20:02:31.933462024 +0000 UTC m=+518.266843346" watchObservedRunningTime="2026-04-16 20:02:31.934982345 +0000 UTC m=+518.268363698"
Apr 16 20:02:31.948724 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:31.948681 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-cgdnl" podStartSLOduration=2.322828306 podStartE2EDuration="4.948668189s" podCreationTimestamp="2026-04-16 20:02:27 +0000 UTC" firstStartedPulling="2026-04-16 20:02:28.435160065 +0000 UTC m=+514.768541380" lastFinishedPulling="2026-04-16 20:02:31.060999965 +0000 UTC m=+517.394381263" observedRunningTime="2026-04-16 20:02:31.947165943 +0000 UTC m=+518.280547265" watchObservedRunningTime="2026-04-16 20:02:31.948668189 +0000 UTC m=+518.282049509"
Apr 16 20:02:31.969662 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:31.969636 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"]
Apr 16 20:02:33.928193 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:33.928129 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-cgdnl" podUID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" containerName="authorino" containerID="cri-o://db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d" gracePeriod=30
Apr 16 20:02:34.162586 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.162563 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-cgdnl"
Apr 16 20:02:34.246388 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.246304 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvkc9\" (UniqueName: \"kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9\") pod \"30b55c50-f06b-4a51-8de7-c24f91d4d4be\" (UID: \"30b55c50-f06b-4a51-8de7-c24f91d4d4be\") "
Apr 16 20:02:34.248440 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.248414 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9" (OuterVolumeSpecName: "kube-api-access-zvkc9") pod "30b55c50-f06b-4a51-8de7-c24f91d4d4be" (UID: "30b55c50-f06b-4a51-8de7-c24f91d4d4be"). InnerVolumeSpecName "kube-api-access-zvkc9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:02:34.347253 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.347219 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvkc9\" (UniqueName: \"kubernetes.io/projected/30b55c50-f06b-4a51-8de7-c24f91d4d4be-kube-api-access-zvkc9\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 20:02:34.932824 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.932795 2560 generic.go:358] "Generic (PLEG): container finished" podID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" containerID="db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d" exitCode=0
Apr 16 20:02:34.932824 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.932829 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-cgdnl" event={"ID":"30b55c50-f06b-4a51-8de7-c24f91d4d4be","Type":"ContainerDied","Data":"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"}
Apr 16 20:02:34.933288 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.932835 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-cgdnl"
Apr 16 20:02:34.933288 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.932849 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-cgdnl" event={"ID":"30b55c50-f06b-4a51-8de7-c24f91d4d4be","Type":"ContainerDied","Data":"b69634d7d8b0d1ad126a788e4823a1c849502fff74518a4db107527fadb82c77"}
Apr 16 20:02:34.933288 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.932870 2560 scope.go:117] "RemoveContainer" containerID="db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"
Apr 16 20:02:34.941164 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.941147 2560 scope.go:117] "RemoveContainer" containerID="db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"
Apr 16 20:02:34.941402 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:02:34.941384 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d\": container with ID starting with db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d not found: ID does not exist" containerID="db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"
Apr 16 20:02:34.941446 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.941416 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d"} err="failed to get container status \"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d\": rpc error: code = NotFound desc = could not find container \"db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d\": container with ID starting with db896e9e16e9548405f2d06566131d9b6e548688766116eb70a266ff5bb2033d not found: ID does not exist"
Apr 16 20:02:34.953681 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.953657 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"]
Apr 16 20:02:34.958713 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:34.958692 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-cgdnl"]
Apr 16 20:02:36.223782 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:36.223735 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" path="/var/lib/kubelet/pods/30b55c50-f06b-4a51-8de7-c24f91d4d4be/volumes"
Apr 16 20:02:39.910625 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:39.910597 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54"
Apr 16 20:02:43.206170 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.206086 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"]
Apr 16 20:02:43.206543 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.206344 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" podUID="25186d54-ca56-4257-ae36-93c55f98e737" containerName="limitador" containerID="cri-o://0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c" gracePeriod=30
Apr 16 20:02:43.735901 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.735879 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54"
Apr 16 20:02:43.832669 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.832601 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file\") pod \"25186d54-ca56-4257-ae36-93c55f98e737\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") "
Apr 16 20:02:43.832804 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.832671 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrvx7\" (UniqueName: \"kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7\") pod \"25186d54-ca56-4257-ae36-93c55f98e737\" (UID: \"25186d54-ca56-4257-ae36-93c55f98e737\") "
Apr 16 20:02:43.832921 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.832900 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file" (OuterVolumeSpecName: "config-file") pod "25186d54-ca56-4257-ae36-93c55f98e737" (UID: "25186d54-ca56-4257-ae36-93c55f98e737"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:43.834569 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.834544 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7" (OuterVolumeSpecName: "kube-api-access-wrvx7") pod "25186d54-ca56-4257-ae36-93c55f98e737" (UID: "25186d54-ca56-4257-ae36-93c55f98e737"). InnerVolumeSpecName "kube-api-access-wrvx7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:02:43.933400 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.933361 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrvx7\" (UniqueName: \"kubernetes.io/projected/25186d54-ca56-4257-ae36-93c55f98e737-kube-api-access-wrvx7\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 20:02:43.933400 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.933391 2560 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25186d54-ca56-4257-ae36-93c55f98e737-config-file\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 20:02:43.966294 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.966262 2560 generic.go:358] "Generic (PLEG): container finished" podID="25186d54-ca56-4257-ae36-93c55f98e737" containerID="0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c" exitCode=0
Apr 16 20:02:43.966455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.966325 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" event={"ID":"25186d54-ca56-4257-ae36-93c55f98e737","Type":"ContainerDied","Data":"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"}
Apr 16 20:02:43.966455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.966332 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54"
Apr 16 20:02:43.966455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.966352 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-t4l54" event={"ID":"25186d54-ca56-4257-ae36-93c55f98e737","Type":"ContainerDied","Data":"2c592588f99780ca7304f7adf885b7a998555522631d71741829b4abce987c7c"}
Apr 16 20:02:43.966455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.966369 2560 scope.go:117] "RemoveContainer" containerID="0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"
Apr 16 20:02:43.975213 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.975194 2560 scope.go:117] "RemoveContainer" containerID="0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"
Apr 16 20:02:43.975474 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:02:43.975447 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c\": container with ID starting with 0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c not found: ID does not exist" containerID="0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"
Apr 16 20:02:43.975538 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.975478 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c"} err="failed to get container status \"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c\": rpc error: code = NotFound desc = could not find container \"0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c\": container with ID starting with 0f45172c93ecb9dd2ddadd95c76eb1cd6fc302d880c8ef526ad4722700478f9c not found: ID does not exist"
Apr 16 20:02:43.988977 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.988956 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"]
Apr 16 20:02:43.993569 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:43.993547 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-t4l54"]
Apr 16 20:02:44.223537 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:44.223504 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25186d54-ca56-4257-ae36-93c55f98e737" path="/var/lib/kubelet/pods/25186d54-ca56-4257-ae36-93c55f98e737/volumes"
Apr 16 20:02:51.969098 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969063 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-q4vpp"]
Apr 16 20:02:51.969481 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969417 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" containerName="authorino"
Apr 16 20:02:51.969481 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969430 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" containerName="authorino"
Apr 16 20:02:51.969481 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969451 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25186d54-ca56-4257-ae36-93c55f98e737" containerName="limitador"
Apr 16 20:02:51.969481 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969456 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="25186d54-ca56-4257-ae36-93c55f98e737" containerName="limitador"
Apr 16 20:02:51.969599 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969509 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b55c50-f06b-4a51-8de7-c24f91d4d4be" containerName="authorino"
Apr 16 20:02:51.969599 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.969549 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="25186d54-ca56-4257-ae36-93c55f98e737" containerName="limitador"
Apr 16 20:02:51.972511 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.972495 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:51.975451 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.975432 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 20:02:51.980132 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:51.980089 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-q4vpp"]
Apr 16 20:02:52.105748 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.105711 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxbgt\" (UniqueName: \"kubernetes.io/projected/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-kube-api-access-lxbgt\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.105748 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.105751 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-tls-cert\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.206599 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.206564 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxbgt\" (UniqueName: \"kubernetes.io/projected/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-kube-api-access-lxbgt\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.206599 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.206604 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-tls-cert\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.209029 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.209009 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-tls-cert\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.215038 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.215011 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxbgt\" (UniqueName: \"kubernetes.io/projected/4a90ee6d-9122-4117-8d56-b8bdcec2f22e-kube-api-access-lxbgt\") pod \"authorino-68bd676465-q4vpp\" (UID: \"4a90ee6d-9122-4117-8d56-b8bdcec2f22e\") " pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.282082 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.282012 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-q4vpp"
Apr 16 20:02:52.404444 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.404404 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-q4vpp"]
Apr 16 20:02:52.406435 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:02:52.406408 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a90ee6d_9122_4117_8d56_b8bdcec2f22e.slice/crio-e56662ee27cd1060a9cba9dd3ecb20ebfad5d29bc66808c1284e8e79abe7b961 WatchSource:0}: Error finding container e56662ee27cd1060a9cba9dd3ecb20ebfad5d29bc66808c1284e8e79abe7b961: Status 404 returned error can't find the container with id e56662ee27cd1060a9cba9dd3ecb20ebfad5d29bc66808c1284e8e79abe7b961
Apr 16 20:02:52.996860 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:52.996827 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-q4vpp" event={"ID":"4a90ee6d-9122-4117-8d56-b8bdcec2f22e","Type":"ContainerStarted","Data":"e56662ee27cd1060a9cba9dd3ecb20ebfad5d29bc66808c1284e8e79abe7b961"}
Apr 16 20:02:54.001678 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.001640 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-q4vpp" event={"ID":"4a90ee6d-9122-4117-8d56-b8bdcec2f22e","Type":"ContainerStarted","Data":"7e45ebf1831564a5702649b70a1d7e836b9e4710e0b1e714337aa26bb4537f82"}
Apr 16 20:02:54.017324 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.017278 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-q4vpp" podStartSLOduration=2.297505168 podStartE2EDuration="3.017263621s" podCreationTimestamp="2026-04-16 20:02:51 +0000 UTC" firstStartedPulling="2026-04-16 20:02:52.40771327 +0000 UTC m=+538.741094569" lastFinishedPulling="2026-04-16 20:02:53.127471723 +0000 UTC m=+539.460853022" observedRunningTime="2026-04-16 20:02:54.016181157 +0000 UTC m=+540.349562480" watchObservedRunningTime="2026-04-16 20:02:54.017263621 +0000 UTC m=+540.350644942"
Apr 16 20:02:54.043511 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.043480 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"]
Apr 16 20:02:54.043768 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.043733 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-8vc59" podUID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" containerName="authorino" containerID="cri-o://f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f" gracePeriod=30
Apr 16 20:02:54.290283 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.290257 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8vc59"
Apr 16 20:02:54.427380 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.427347 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkxkj\" (UniqueName: \"kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj\") pod \"4b900d2b-9062-4c36-8c87-ab82f048b6d6\" (UID: \"4b900d2b-9062-4c36-8c87-ab82f048b6d6\") "
Apr 16 20:02:54.429363 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.429340 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj" (OuterVolumeSpecName: "kube-api-access-mkxkj") pod "4b900d2b-9062-4c36-8c87-ab82f048b6d6" (UID: "4b900d2b-9062-4c36-8c87-ab82f048b6d6"). InnerVolumeSpecName "kube-api-access-mkxkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:02:54.528293 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:54.528243 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkxkj\" (UniqueName: \"kubernetes.io/projected/4b900d2b-9062-4c36-8c87-ab82f048b6d6-kube-api-access-mkxkj\") on node \"ip-10-0-131-77.ec2.internal\" DevicePath \"\""
Apr 16 20:02:55.005681 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.005646 2560 generic.go:358] "Generic (PLEG): container finished" podID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" containerID="f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f" exitCode=0
Apr 16 20:02:55.006126 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.005694 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8vc59"
Apr 16 20:02:55.006126 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.005723 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8vc59" event={"ID":"4b900d2b-9062-4c36-8c87-ab82f048b6d6","Type":"ContainerDied","Data":"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"}
Apr 16 20:02:55.006126 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.005770 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8vc59" event={"ID":"4b900d2b-9062-4c36-8c87-ab82f048b6d6","Type":"ContainerDied","Data":"d0ba024271ec551ce255535140eca6aad0c63970757efc31eebc5f89f6145e5c"}
Apr 16 20:02:55.006126 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.005795 2560 scope.go:117] "RemoveContainer" containerID="f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"
Apr 16 20:02:55.013590 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.013571 2560 scope.go:117] "RemoveContainer" containerID="f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"
Apr 16 20:02:55.013828 ip-10-0-131-77 kubenswrapper[2560]: E0416 20:02:55.013812 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f\": container with ID starting with f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f not found: ID does not exist" containerID="f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"
Apr 16 20:02:55.013902 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.013835 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f"} err="failed to get container status \"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f\": rpc error: code = NotFound desc = could not find container \"f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f\": container with ID starting with f411936d62ce9c2c7f1644d49b89c0bda0a059478012c748764eaf88a557b83f not found: ID does not exist"
Apr 16 20:02:55.028579 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.028429 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"]
Apr 16 20:02:55.029874 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:55.029854 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8vc59"]
Apr 16 20:02:56.223856 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:02:56.223823 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" path="/var/lib/kubelet/pods/4b900d2b-9062-4c36-8c87-ab82f048b6d6/volumes"
Apr 16 20:03:02.348952 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.348920 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"]
Apr 16 20:03:02.349347 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.349286 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" containerName="authorino"
Apr 16 20:03:02.349347 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.349298 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" containerName="authorino"
Apr 16 20:03:02.349432 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.349358 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b900d2b-9062-4c36-8c87-ab82f048b6d6" containerName="authorino"
Apr 16 20:03:02.353927 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.353911 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.356793 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356752 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 20:03:02.356793 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356775 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 20:03:02.356793 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356789 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-sdrlk\""
Apr 16 20:03:02.357043 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356816 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 20:03:02.357043 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356833 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 20:03:02.357043 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356789 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 20:03:02.357043 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.356909 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 20:03:02.367327 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.367304 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"]
Apr 16 20:03:02.499626 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499594 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/75db4ec3-a7da-479e-962f-a714809041d4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499779 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499635 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499779 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499676 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499779 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499747 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499779 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499776 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjx89\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-kube-api-access-cjx89\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499932 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499825 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.499932 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.499846 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601362 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601278 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601362 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601315 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601362 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601343 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601362 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601362 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjx89\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-kube-api-access-cjx89\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601729 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601413 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601729 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601442 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.601729 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.601509 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/75db4ec3-a7da-479e-962f-a714809041d4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.602133 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.602083 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.603841 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.603811 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.603841 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.603818 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/75db4ec3-a7da-479e-962f-a714809041d4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.604175 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.604160 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.604305 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.604282 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75db4ec3-a7da-479e-962f-a714809041d4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.610783 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.610757 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.612892 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.612868 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjx89\" (UniqueName: \"kubernetes.io/projected/75db4ec3-a7da-479e-962f-a714809041d4-kube-api-access-cjx89\") pod \"istiod-openshift-gateway-55ff986f96-4trrr\" (UID: \"75db4ec3-a7da-479e-962f-a714809041d4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.664010 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.663966 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
Apr 16 20:03:02.793804 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:02.793771 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"]
Apr 16 20:03:02.796052 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:03:02.796024 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75db4ec3_a7da_479e_962f_a714809041d4.slice/crio-f85b0237c0ea07af8d40fd3b12629d6c0859bfe2ca267b9bcb2d68733503688b WatchSource:0}: Error finding container f85b0237c0ea07af8d40fd3b12629d6c0859bfe2ca267b9bcb2d68733503688b: Status 404 returned error can't find the container with id f85b0237c0ea07af8d40fd3b12629d6c0859bfe2ca267b9bcb2d68733503688b
Apr 16 20:03:03.034974 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:03.034937 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr" event={"ID":"75db4ec3-a7da-479e-962f-a714809041d4","Type":"ContainerStarted","Data":"f85b0237c0ea07af8d40fd3b12629d6c0859bfe2ca267b9bcb2d68733503688b"}
Apr 16 20:03:06.050461 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:06.050427 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 20:03:06.050734 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:06.050496 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 20:03:07.049403 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:07.049364 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr"
event={"ID":"75db4ec3-a7da-479e-962f-a714809041d4","Type":"ContainerStarted","Data":"741a3b9eda9bd2e02813297a42de308a8bf9eae636c44b09bf2053d48474334a"} Apr 16 20:03:07.049613 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:07.049581 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr" Apr 16 20:03:07.051393 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:07.051366 2560 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-4trrr container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 20:03:07.051704 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:07.051410 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr" podUID="75db4ec3-a7da-479e-962f-a714809041d4" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:07.089757 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:07.089705 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr" podStartSLOduration=1.837337075 podStartE2EDuration="5.089690207s" podCreationTimestamp="2026-04-16 20:03:02 +0000 UTC" firstStartedPulling="2026-04-16 20:03:02.79784265 +0000 UTC m=+549.131223950" lastFinishedPulling="2026-04-16 20:03:06.05019577 +0000 UTC m=+552.383577082" observedRunningTime="2026-04-16 20:03:07.086100043 +0000 UTC m=+553.419481378" watchObservedRunningTime="2026-04-16 20:03:07.089690207 +0000 UTC m=+553.423071527" Apr 16 20:03:08.053385 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:03:08.053357 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4trrr" Apr 16 20:18:52.072355 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.072318 2560 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-798f6c5c97-njh57"] Apr 16 20:18:52.075620 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.075604 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.079698 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.079672 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:18:52.079839 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.079703 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:18:52.079839 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.079730 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-l62p9\"" Apr 16 20:18:52.079839 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.079764 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 20:18:52.083326 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.083284 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-798f6c5c97-njh57"] Apr 16 20:18:52.117064 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.117028 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-cert\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.117064 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.117069 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnl5j\" (UniqueName: 
\"kubernetes.io/projected/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-kube-api-access-nnl5j\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.218201 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.218167 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-cert\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.218370 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.218209 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnl5j\" (UniqueName: \"kubernetes.io/projected/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-kube-api-access-nnl5j\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.220516 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.220492 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-cert\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.229281 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.229249 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnl5j\" (UniqueName: \"kubernetes.io/projected/3b00a6d5-ddfd-4f12-b230-bb1579a4e431-kube-api-access-nnl5j\") pod \"llmisvc-controller-manager-798f6c5c97-njh57\" (UID: \"3b00a6d5-ddfd-4f12-b230-bb1579a4e431\") " pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.386829 ip-10-0-131-77 
kubenswrapper[2560]: I0416 20:18:52.386752 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:18:52.508680 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.508619 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-798f6c5c97-njh57"] Apr 16 20:18:52.510873 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:18:52.510844 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b00a6d5_ddfd_4f12_b230_bb1579a4e431.slice/crio-e335a51aa251bb12749e87f2c2fa7428cc7df00102ed3d727e1d6f57c5e57fd6 WatchSource:0}: Error finding container e335a51aa251bb12749e87f2c2fa7428cc7df00102ed3d727e1d6f57c5e57fd6: Status 404 returned error can't find the container with id e335a51aa251bb12749e87f2c2fa7428cc7df00102ed3d727e1d6f57c5e57fd6 Apr 16 20:18:52.512086 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:52.512070 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:18:53.248472 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:53.248439 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" event={"ID":"3b00a6d5-ddfd-4f12-b230-bb1579a4e431","Type":"ContainerStarted","Data":"e335a51aa251bb12749e87f2c2fa7428cc7df00102ed3d727e1d6f57c5e57fd6"} Apr 16 20:18:56.259336 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:56.259301 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" event={"ID":"3b00a6d5-ddfd-4f12-b230-bb1579a4e431","Type":"ContainerStarted","Data":"183c06eab5cc6d2a23272f9b102edfeeada19a6c9ce1909462c245d14579d577"} Apr 16 20:18:56.259708 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:56.259444 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 
20:18:56.276797 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:18:56.276701 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" podStartSLOduration=0.764986493 podStartE2EDuration="4.276686989s" podCreationTimestamp="2026-04-16 20:18:52 +0000 UTC" firstStartedPulling="2026-04-16 20:18:52.512224048 +0000 UTC m=+1498.845605348" lastFinishedPulling="2026-04-16 20:18:56.023924545 +0000 UTC m=+1502.357305844" observedRunningTime="2026-04-16 20:18:56.275626487 +0000 UTC m=+1502.609007810" watchObservedRunningTime="2026-04-16 20:18:56.276686989 +0000 UTC m=+1502.610068311" Apr 16 20:19:27.266942 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:19:27.266867 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-798f6c5c97-njh57" Apr 16 20:20:33.456302 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.456267 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d"] Apr 16 20:20:33.460319 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.460293 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.463139 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.463089 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:20:33.463280 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.463152 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 20:20:33.463280 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.463185 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:20:33.463481 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.463460 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-mpzpb\"" Apr 16 20:20:33.471403 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.471381 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d"] Apr 16 20:20:33.516786 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516753 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.516914 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516791 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-envoy\") pod 
\"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.516914 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516825 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.516914 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516888 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dafbea83-af49-4945-95e0-8a8b59909243-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.517029 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516922 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.517029 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516957 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dafbea83-af49-4945-95e0-8a8b59909243-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: 
\"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.517029 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.516980 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.517182 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.517028 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknm7\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-kube-api-access-fknm7\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.517182 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.517056 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618516 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618481 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618687 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618520 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dafbea83-af49-4945-95e0-8a8b59909243-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618687 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618542 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618687 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618664 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dafbea83-af49-4945-95e0-8a8b59909243-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618844 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618702 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618844 ip-10-0-131-77 kubenswrapper[2560]: I0416 
20:20:33.618746 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fknm7\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-kube-api-access-fknm7\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618844 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618778 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.618844 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618805 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619017 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618845 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619017 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.618861 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619151 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.619065 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619262 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.619246 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619303 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.619260 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dafbea83-af49-4945-95e0-8a8b59909243-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.619303 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.619264 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: 
\"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.621019 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.620996 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dafbea83-af49-4945-95e0-8a8b59909243-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.621396 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.621380 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dafbea83-af49-4945-95e0-8a8b59909243-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.627311 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.627290 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.627566 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.627549 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknm7\" (UniqueName: \"kubernetes.io/projected/dafbea83-af49-4945-95e0-8a8b59909243-kube-api-access-fknm7\") pod \"router-gateway-2-openshift-default-6866b85949-5g48d\" (UID: \"dafbea83-af49-4945-95e0-8a8b59909243\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.772911 ip-10-0-131-77 kubenswrapper[2560]: 
I0416 20:20:33.772823 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:33.922062 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:33.922035 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d"] Apr 16 20:20:33.923186 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:20:33.923157 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddafbea83_af49_4945_95e0_8a8b59909243.slice/crio-20836657d825a157c25bc7fa5810bf316085e7e3d056bafec3fe36927b2b5add WatchSource:0}: Error finding container 20836657d825a157c25bc7fa5810bf316085e7e3d056bafec3fe36927b2b5add: Status 404 returned error can't find the container with id 20836657d825a157c25bc7fa5810bf316085e7e3d056bafec3fe36927b2b5add Apr 16 20:20:34.592337 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:34.592298 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" event={"ID":"dafbea83-af49-4945-95e0-8a8b59909243","Type":"ContainerStarted","Data":"20836657d825a157c25bc7fa5810bf316085e7e3d056bafec3fe36927b2b5add"} Apr 16 20:20:36.451985 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.451951 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 20:20:36.452237 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.452044 2560 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 20:20:36.452237 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.452073 2560 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 20:20:36.602193 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.602155 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" event={"ID":"dafbea83-af49-4945-95e0-8a8b59909243","Type":"ContainerStarted","Data":"65036cd4a4af5fddf907f91ebd838376b0a024647dba5113c86a5230f1956489"} Apr 16 20:20:36.624347 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.624300 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" podStartSLOduration=1.097557548 podStartE2EDuration="3.624282485s" podCreationTimestamp="2026-04-16 20:20:33 +0000 UTC" firstStartedPulling="2026-04-16 20:20:33.924983274 +0000 UTC m=+1600.258364573" lastFinishedPulling="2026-04-16 20:20:36.451708201 +0000 UTC m=+1602.785089510" observedRunningTime="2026-04-16 20:20:36.622855652 +0000 UTC m=+1602.956236974" watchObservedRunningTime="2026-04-16 20:20:36.624282485 +0000 UTC m=+1602.957663808" Apr 16 20:20:36.773429 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:36.773350 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:37.774547 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:37.774508 2560 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" podUID="dafbea83-af49-4945-95e0-8a8b59909243" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.35:15021/healthz/ready\": dial tcp 10.133.0.35:15021: connect: connection refused" Apr 16 20:20:38.773683 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:38.773641 2560 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" podUID="dafbea83-af49-4945-95e0-8a8b59909243" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.35:15021/healthz/ready\": dial tcp 10.133.0.35:15021: connect: connection refused" Apr 16 20:20:39.777859 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:39.777825 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:39.778277 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:39.778086 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:20:39.778800 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:20:39.778785 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5g48d" Apr 16 20:26:43.370960 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:43.370875 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:44.356577 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:44.356467 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:45.341683 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:45.341655 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:46.303102 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:46.303073 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:47.260102 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:47.260060 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:48.203699 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:48.203672 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:49.157218 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:49.157187 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:50.109682 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:50.109652 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:51.082568 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:51.082529 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:52.018869 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:52.018828 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:52.940757 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:52.940730 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:53.869341 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:53.869311 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:54.831028 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:54.830999 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:55.825414 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:55.825386 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5g48d_dafbea83-af49-4945-95e0-8a8b59909243/istio-proxy/0.log" Apr 16 20:26:56.838348 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:56.838319 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4trrr_75db4ec3-a7da-479e-962f-a714809041d4/discovery/0.log" Apr 16 20:26:57.651579 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:57.651550 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4trrr_75db4ec3-a7da-479e-962f-a714809041d4/discovery/0.log" Apr 16 20:26:58.447045 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:58.447018 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-q4vpp_4a90ee6d-9122-4117-8d56-b8bdcec2f22e/authorino/0.log" Apr 16 20:26:58.477158 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:58.477139 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rdkhj_343e342c-4b0c-47e4-8d65-800f3ca262b7/manager/0.log" Apr 16 
20:26:58.490419 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:26:58.490401 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-xq84n_6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76/kuadrant-console-plugin/0.log" Apr 16 20:27:03.799581 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:03.799554 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5gp27_243af810-99f2-40cb-b920-2355426fbf4e/global-pull-secret-syncer/0.log" Apr 16 20:27:03.920944 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:03.920916 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-k8ssj_17f8be1a-9a64-4b60-a18e-62b62402d4ed/konnectivity-agent/0.log" Apr 16 20:27:03.999791 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:03.999752 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-77.ec2.internal_13eeb99f39e0cf01e19afd359b85300f/haproxy/0.log" Apr 16 20:27:08.143067 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:08.142976 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-q4vpp_4a90ee6d-9122-4117-8d56-b8bdcec2f22e/authorino/0.log" Apr 16 20:27:08.189711 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:08.189679 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rdkhj_343e342c-4b0c-47e4-8d65-800f3ca262b7/manager/0.log" Apr 16 20:27:08.214281 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:08.214251 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-xq84n_6f1d8bbb-0d46-47c1-b6d4-d2891d4f8a76/kuadrant-console-plugin/0.log" Apr 16 20:27:09.249004 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.248975 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/alertmanager/0.log" Apr 16 20:27:09.271015 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.270950 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/config-reloader/0.log" Apr 16 20:27:09.298520 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.298494 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/kube-rbac-proxy-web/0.log" Apr 16 20:27:09.319487 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.319460 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/kube-rbac-proxy/0.log" Apr 16 20:27:09.348299 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.348274 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/kube-rbac-proxy-metric/0.log" Apr 16 20:27:09.374683 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.374666 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/prom-label-proxy/0.log" Apr 16 20:27:09.395777 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.395757 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5811c3b-fd59-47fa-8b21-1124a15657a2/init-config-reloader/0.log" Apr 16 20:27:09.447696 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.447674 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hsz8z_e34fff2f-e990-435e-ada9-a8f5ac7799cc/kube-state-metrics/0.log" Apr 16 20:27:09.468282 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.468260 2560 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hsz8z_e34fff2f-e990-435e-ada9-a8f5ac7799cc/kube-rbac-proxy-main/0.log" Apr 16 20:27:09.488451 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.488430 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hsz8z_e34fff2f-e990-435e-ada9-a8f5ac7799cc/kube-rbac-proxy-self/0.log" Apr 16 20:27:09.547231 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.547210 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-qbxfm_e0986774-86cf-4b7b-9210-db0b4d7d82f9/monitoring-plugin/0.log" Apr 16 20:27:09.662072 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.662042 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ff6mc_b05e9c85-30b0-49cf-a0ed-17c42870fe63/node-exporter/0.log" Apr 16 20:27:09.686186 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.686159 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ff6mc_b05e9c85-30b0-49cf-a0ed-17c42870fe63/kube-rbac-proxy/0.log" Apr 16 20:27:09.715019 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.714999 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ff6mc_b05e9c85-30b0-49cf-a0ed-17c42870fe63/init-textfile/0.log" Apr 16 20:27:09.840334 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.840258 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2tlfk_b9fec3bd-e435-4443-8f93-f18406c9bc9a/kube-rbac-proxy-main/0.log" Apr 16 20:27:09.862874 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.862849 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2tlfk_b9fec3bd-e435-4443-8f93-f18406c9bc9a/kube-rbac-proxy-self/0.log" Apr 16 20:27:09.888709 ip-10-0-131-77 
kubenswrapper[2560]: I0416 20:27:09.888691 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2tlfk_b9fec3bd-e435-4443-8f93-f18406c9bc9a/openshift-state-metrics/0.log" Apr 16 20:27:09.945347 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.945314 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/prometheus/0.log" Apr 16 20:27:09.966785 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.966765 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/config-reloader/0.log" Apr 16 20:27:09.991271 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:09.991246 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/thanos-sidecar/0.log" Apr 16 20:27:10.013195 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.013168 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/kube-rbac-proxy-web/0.log" Apr 16 20:27:10.036421 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.036402 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/kube-rbac-proxy/0.log" Apr 16 20:27:10.058943 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.058919 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/kube-rbac-proxy-thanos/0.log" Apr 16 20:27:10.080416 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.080396 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_eb3e1c48-92d5-41bb-92b4-7c99a14cb4bf/init-config-reloader/0.log" Apr 16 20:27:10.112877 ip-10-0-131-77 
kubenswrapper[2560]: I0416 20:27:10.112820 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7jvps_11a83071-15af-481e-866a-601e7919b2a3/prometheus-operator/0.log" Apr 16 20:27:10.137889 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.137870 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7jvps_11a83071-15af-481e-866a-601e7919b2a3/kube-rbac-proxy/0.log" Apr 16 20:27:10.164342 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.164321 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-tfrkd_92ba2761-16cb-4de3-9015-696acbf9c5ae/prometheus-operator-admission-webhook/0.log" Apr 16 20:27:10.286760 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.286735 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/thanos-query/0.log" Apr 16 20:27:10.306570 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.306547 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/kube-rbac-proxy-web/0.log" Apr 16 20:27:10.327953 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.327929 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/kube-rbac-proxy/0.log" Apr 16 20:27:10.352869 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.352847 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/prom-label-proxy/0.log" Apr 16 20:27:10.374381 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.374307 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/kube-rbac-proxy-rules/0.log" Apr 16 20:27:10.396134 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:10.396099 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84b5f7ccf8-r95kp_b4cdbf7b-b55d-40a7-832b-619ef6754e08/kube-rbac-proxy-metrics/0.log" Apr 16 20:27:12.527769 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.527738 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-5gz5z_2fcbb9c9-45d1-42ca-ad88-df53ea6940b7/download-server/0.log" Apr 16 20:27:12.794898 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.794815 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj"] Apr 16 20:27:12.798590 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.798572 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.801433 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.801411 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"kube-root-ca.crt\"" Apr 16 20:27:12.802598 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.802577 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"openshift-service-ca.crt\"" Apr 16 20:27:12.802705 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.802583 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-plvk7\"/\"default-dockercfg-57lmf\"" Apr 16 20:27:12.805977 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.805945 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj"] Apr 16 20:27:12.875521 ip-10-0-131-77 
kubenswrapper[2560]: I0416 20:27:12.875498 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-sys\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.875665 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.875596 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-proc\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.875665 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.875622 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-podres\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.875738 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.875689 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9h4g\" (UniqueName: \"kubernetes.io/projected/9bd48744-a5fb-4952-acaa-06f7bfa5610d-kube-api-access-w9h4g\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.875738 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.875712 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-lib-modules\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976425 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-proc\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976455 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976453 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-podres\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976480 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9h4g\" (UniqueName: \"kubernetes.io/projected/9bd48744-a5fb-4952-acaa-06f7bfa5610d-kube-api-access-w9h4g\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976498 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-lib-modules\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: 
I0416 20:27:12.976553 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-proc\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976601 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-podres\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976603 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-lib-modules\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976638 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976613 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-sys\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:12.976851 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.976646 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9bd48744-a5fb-4952-acaa-06f7bfa5610d-sys\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 
20:27:12.984576 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:12.984550 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9h4g\" (UniqueName: \"kubernetes.io/projected/9bd48744-a5fb-4952-acaa-06f7bfa5610d-kube-api-access-w9h4g\") pod \"perf-node-gather-daemonset-8q8mj\" (UID: \"9bd48744-a5fb-4952-acaa-06f7bfa5610d\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:13.109346 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.109263 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:13.232032 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.232001 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj"] Apr 16 20:27:13.234183 ip-10-0-131-77 kubenswrapper[2560]: W0416 20:27:13.234147 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9bd48744_a5fb_4952_acaa_06f7bfa5610d.slice/crio-e666d03275f8fab8ff2669579c2eabc6d582051d59c90b3ae1292111016d30fc WatchSource:0}: Error finding container e666d03275f8fab8ff2669579c2eabc6d582051d59c90b3ae1292111016d30fc: Status 404 returned error can't find the container with id e666d03275f8fab8ff2669579c2eabc6d582051d59c90b3ae1292111016d30fc Apr 16 20:27:13.235834 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.235811 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:27:13.741237 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.741212 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kbg5k_28a8b516-10dc-45a7-9ab7-f91fcd27a842/dns/0.log" Apr 16 20:27:13.760347 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.760325 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-kbg5k_28a8b516-10dc-45a7-9ab7-f91fcd27a842/kube-rbac-proxy/0.log" Apr 16 20:27:13.825211 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.825183 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4ppnq_f46654e0-89ad-48e3-ae92-6dec0b5e5d80/dns-node-resolver/0.log" Apr 16 20:27:13.960379 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.960345 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" event={"ID":"9bd48744-a5fb-4952-acaa-06f7bfa5610d","Type":"ContainerStarted","Data":"0800c1923f0f82199e4504c6b388605854b67b9960770a260e486821d21af845"} Apr 16 20:27:13.960379 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.960380 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" event={"ID":"9bd48744-a5fb-4952-acaa-06f7bfa5610d","Type":"ContainerStarted","Data":"e666d03275f8fab8ff2669579c2eabc6d582051d59c90b3ae1292111016d30fc"} Apr 16 20:27:13.960589 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.960404 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:13.978093 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:13.978044 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" podStartSLOduration=1.9780305299999998 podStartE2EDuration="1.97803053s" podCreationTimestamp="2026-04-16 20:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:13.976901775 +0000 UTC m=+2000.310283108" watchObservedRunningTime="2026-04-16 20:27:13.97803053 +0000 UTC m=+2000.311411859" Apr 16 20:27:14.403218 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:14.403188 2560 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w48w2_038b7332-e7c9-4001-a9ec-bfe5e7d7e1c8/node-ca/0.log" Apr 16 20:27:15.205401 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:15.205368 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4trrr_75db4ec3-a7da-479e-962f-a714809041d4/discovery/0.log" Apr 16 20:27:15.727451 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:15.727417 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t7r5g_8b2bac48-99a7-47ac-b46a-269204d0bfe5/serve-healthcheck-canary/0.log" Apr 16 20:27:16.304662 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:16.304632 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qxbw4_74c4c920-cd1c-4a00-acef-32d4f5377828/kube-rbac-proxy/0.log" Apr 16 20:27:16.325127 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:16.325082 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qxbw4_74c4c920-cd1c-4a00-acef-32d4f5377828/exporter/0.log" Apr 16 20:27:16.345768 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:16.345743 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qxbw4_74c4c920-cd1c-4a00-acef-32d4f5377828/extractor/0.log" Apr 16 20:27:18.884888 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:18.884857 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-ddc57ffc5-n74kb_bacd7afe-b6b7-4f06-8ece-63857ab75f80/manager/0.log" Apr 16 20:27:18.940542 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:18.940517 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-xnmhf_ca7d0703-b6b2-4c92-af9f-e3cb9bf9700a/openshift-lws-operator/0.log" Apr 16 20:27:19.579145 ip-10-0-131-77 
kubenswrapper[2560]: I0416 20:27:19.579100 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-798f6c5c97-njh57_3b00a6d5-ddfd-4f12-b230-bb1579a4e431/manager/0.log" Apr 16 20:27:19.974228 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:19.974203 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-8q8mj" Apr 16 20:27:26.267202 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.267171 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/kube-multus-additional-cni-plugins/0.log" Apr 16 20:27:26.288761 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.288726 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/egress-router-binary-copy/0.log" Apr 16 20:27:26.308728 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.308706 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/cni-plugins/0.log" Apr 16 20:27:26.329754 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.329736 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/bond-cni-plugin/0.log" Apr 16 20:27:26.349569 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.349546 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/routeoverride-cni/0.log" Apr 16 20:27:26.367970 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.367953 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/whereabouts-cni-bincopy/0.log" Apr 16 20:27:26.386592 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.386572 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s45wn_2535cb5c-f95c-424b-a266-b74f5c7f4b0b/whereabouts-cni/0.log" Apr 16 20:27:26.412651 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.412628 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dhphh_af1645bd-5537-49e3-ae71-81cf97501bb8/kube-multus/0.log" Apr 16 20:27:26.573951 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.573872 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nnh4p_e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0/network-metrics-daemon/0.log" Apr 16 20:27:26.595097 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:26.595068 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nnh4p_e6dd25f1-c4b9-4e8a-9a38-70bb41d1ded0/kube-rbac-proxy/0.log" Apr 16 20:27:27.390465 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.390440 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/ovn-controller/0.log" Apr 16 20:27:27.415289 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.415265 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/ovn-acl-logging/0.log" Apr 16 20:27:27.432283 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.432257 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/kube-rbac-proxy-node/0.log" Apr 16 20:27:27.451180 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.451157 2560 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:27:27.468255 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.468230 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/northd/0.log" Apr 16 20:27:27.487620 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.487599 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/nbdb/0.log" Apr 16 20:27:27.505923 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.505904 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/sbdb/0.log" Apr 16 20:27:27.621322 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:27.621296 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29x9d_dea9bf41-88be-4138-b6f1-4334d36c5ca3/ovnkube-controller/0.log" Apr 16 20:27:29.306239 ip-10-0-131-77 kubenswrapper[2560]: I0416 20:27:29.306207 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vxhp2_e5be099e-d9c4-4a29-af14-f803d80a9636/network-check-target-container/0.log"