Apr 21 02:41:06.964539 ip-10-0-134-66 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 02:41:07.342597 ip-10-0-134-66 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.342597 ip-10-0-134-66 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 02:41:07.342597 ip-10-0-134-66 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.342597 ip-10-0-134-66 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 02:41:07.342597 ip-10-0-134-66 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:41:07.343756 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.343390    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 02:41:07.346542 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346527    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.346542 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346541    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346545    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346549    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346552    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346555    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346558    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346561    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346564    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346568    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346570    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346573    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346576    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346579    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346581    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346584    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346587    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346590    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346593    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346595    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346598    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.346608 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346600    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346603    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346606    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346609    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346612    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346615    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346618    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346621    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346624    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346627    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346629    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346632    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346635    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346637    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346640    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346642    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346645    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346647    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346650    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346660    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.347103 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346664    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346667    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346669    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346672    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346675    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346678    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346681    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346684    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346687    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346689    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346692    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346695    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346697    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346700    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346704    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346709    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346712    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346714    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346717    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346720    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.347607 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346722    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346725    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346728    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346730    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346733    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346735    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346738    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346740    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346742    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346745    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346747    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346757    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346760    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346762    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346765    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346768    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346770    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346773    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346775    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346778    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.348098 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346780    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346783    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346785    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346788    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.346790    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347881    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347889    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347891    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347894    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347897    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347900    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347903    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347906    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347908    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347912    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347915    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347918    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347921    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347924    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.348623 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347926    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347929    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347932    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347934    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347938    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347941    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347943    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347946    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347948    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347951    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347953    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347956    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347958    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347961    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347964    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347967    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347969    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347972    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347974    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347978    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.349091 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347981    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347984    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347986    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347989    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347991    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347994    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.347996    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348000    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348003    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348005    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348008    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348010    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348013    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348016    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348018    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348021    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348023    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348027    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348030    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348032    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.349637 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348035    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348037    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348040    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348042    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348045    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348047    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348050    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348053    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348055    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348058    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348061    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348063    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348066    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348068    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348071    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348073    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348076    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348079    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348081    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348085    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.350123 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348089    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348092    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348094    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348097    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348101    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348105    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348108    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348111    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348114    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348117    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348120    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348123    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348189    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348200    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348206    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348210    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348215    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348218    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348222    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348226    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348230    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 02:41:07.350629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348246    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348250    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348253    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348256    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348259    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348262    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348265    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348268    2572 flags.go:64] FLAG: --cloud-config=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348271    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348273    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348278    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348281    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348284    2572 flags.go:64] FLAG: --config-dir=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348287    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348291    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348295    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348298    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348301    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348304    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348308    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348311    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348314    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348318    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348320    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348325    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 02:41:07.351147 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348328    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348331    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348334    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348337    2572 flags.go:64] FLAG: --enable-server="true"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348340    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348344    2572 flags.go:64] FLAG: --event-burst="100"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348347    2572 flags.go:64] FLAG: --event-qps="50"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348350    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348353    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348356    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348360    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348363    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348366    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348369    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348372    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348375    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348378    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348382    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348385    2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348388    2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348390    2572 flags.go:64] FLAG: --feature-gates=""
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348395    2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348398    2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348401    2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348404    2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348407    2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 02:41:07.351761 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348410    2572 flags.go:64] FLAG: --help="false"
Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348413    2572 flags.go:64] FLAG: --hostname-override="ip-10-0-134-66.ec2.internal"
Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348416    2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348419    2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348423    2572 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348426 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348430 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348432 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348436 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348439 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348442 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348445 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348448 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348451 2572 flags.go:64] FLAG: --kube-reserved="" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348454 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348457 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348460 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348463 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 02:41:07.352388 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:41:07.348465 2572 flags.go:64] FLAG: --lock-file="" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348468 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348472 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348475 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348480 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 02:41:07.352388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348483 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348486 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348489 2572 flags.go:64] FLAG: --logging-format="text" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348498 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348502 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348505 2572 flags.go:64] FLAG: --manifest-url="" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348508 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348512 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348515 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348519 2572 flags.go:64] FLAG: --max-pods="110" Apr 21 02:41:07.352976 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348522 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348525 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348528 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348531 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348535 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348538 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348541 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348550 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348553 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348556 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348559 2572 flags.go:64] FLAG: --pod-cidr="" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348562 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348568 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348570 2572 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 21 02:41:07.352976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348574 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348576 2572 flags.go:64] FLAG: --port="10250" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348579 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348582 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a1fb252c502721ee" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348585 2572 flags.go:64] FLAG: --qos-reserved="" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348588 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348595 2572 flags.go:64] FLAG: --register-node="true" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348598 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348601 2572 flags.go:64] FLAG: --register-with-taints="" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348604 2572 flags.go:64] FLAG: --registry-burst="10" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348607 2572 flags.go:64] FLAG: --registry-qps="5" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348610 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348613 2572 flags.go:64] FLAG: --reserved-memory="" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348617 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348620 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 
02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348623 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348626 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348629 2572 flags.go:64] FLAG: --runonce="false" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348632 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348635 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348638 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348641 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348644 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348648 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348651 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348655 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 02:41:07.353581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348657 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348661 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348663 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:41:07.348666 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348669 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348672 2572 flags.go:64] FLAG: --system-cgroups="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348675 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348680 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348683 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348686 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348690 2572 flags.go:64] FLAG: --tls-min-version="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348693 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348697 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348700 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348703 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348706 2572 flags.go:64] FLAG: --v="2" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348710 2572 flags.go:64] FLAG: --version="false" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348714 2572 flags.go:64] FLAG: --vmodule="" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:41:07.348719 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.348722 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348814 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348818 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348822 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348825 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:41:07.354192 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348827 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348830 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348833 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348835 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348838 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348841 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348846 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:41:07.354825 
ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348849 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348852 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348854 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348857 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348859 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348862 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348865 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348867 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348870 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348873 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348875 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348878 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:41:07.354825 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348880 2572 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348885 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348887 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348890 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348893 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348895 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348898 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348900 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348903 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348905 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348908 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348910 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348913 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 
02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348915 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348918 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348920 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348923 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348925 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348928 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:41:07.355334 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348930 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348937 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348940 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348944 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348948 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348950 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348953 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348956 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348959 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348961 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348964 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348966 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348969 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348973 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348978 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348981 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348984 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348987 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348990 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.355804 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348993 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348995 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.348998 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349000 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349003 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349006 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349008 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349011 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349014 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349016 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349019 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349021 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349024 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349026 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349031 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349034 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349036 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349039 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349041 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349044 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.356283 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349046 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349049 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349051 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349054 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.349057 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.349661 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.355858 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.355872 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355916 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355921 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355924 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355927 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355930 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355933 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355936 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355938 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.356779 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355941 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355944 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355947 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355950 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355953 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355956 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355958 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355961 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355964 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355967 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355969 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355972 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355974 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355977 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355982 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355986 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355989 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355992 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355995 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.355998 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.357172 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356000 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356003 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356006 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356010 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356012 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356015 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356017 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356020 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356023 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356025 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356028 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356031 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356033 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356035 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356038 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356040 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356043 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356045 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356048 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356051 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.357709 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356053 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356056 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356058 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356061 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356063 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356066 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356069 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356071 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356073 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356076 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356078 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356081 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356083 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356086 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356088 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356091 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356095 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356097 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356100 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356102 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.358191 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356104 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356107 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356110 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356112 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356114 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356117 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356119 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356122 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356124 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356126 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356129 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356132 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356134 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356138 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356141 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356144 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356147 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.358681 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356149 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.356154 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356276 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356282 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356285 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356288 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356291 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356293 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356296 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356299 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356302 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356305 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356308 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356311 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356315 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 02:41:07.359105 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356318 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356321 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356324 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356326 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356329 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356332 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356335 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356338 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356341 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356343 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356346 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356348 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356351 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356353 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356356 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356358 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356361 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356363 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356366 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 02:41:07.359482 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356368 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356371 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356373 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356376 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356378 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356381 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356383 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356385 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356388 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356391 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356393 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356397 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356400 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356403 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356405 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356407 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356410 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356413 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356415 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356417 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 02:41:07.359934 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356420 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356422 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356425 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356427 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356430 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356432 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356435 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356437 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356440 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356443 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356445 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356447 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356450 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356452 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356455 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356457 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356460 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356462 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356465 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356467 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:41:07.360446 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356470 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356472 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356475 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356477 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356480 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356483 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356485 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356488 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356490 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356493 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356495 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356498 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356500 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:07.356503 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.356507 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 02:41:07.360926 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.356602 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 02:41:07.361306 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.358592 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 02:41:07.361306 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.359503 2572 server.go:1019] "Starting client certificate rotation"
Apr 21 02:41:07.361306 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.359602 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 02:41:07.361306 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.360444 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 02:41:07.382985 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.382966 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 02:41:07.385693 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.385665 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 02:41:07.398117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.398097 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 21 02:41:07.402950 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.402936 2572 log.go:25] "Validated CRI v1 image API"
Apr 21 02:41:07.404065 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.404048 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 02:41:07.406667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.406649 2572 fs.go:135] Filesystem UUIDs: map[02430ce4-af86-4d46-ac26-49b8d0b88165:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 dcf5d5d8-2a67-43f6-be4d-dba1e1ea519a:/dev/nvme0n1p3]
Apr 21 02:41:07.406729 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.406667 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 02:41:07.412946 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.412922 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 02:41:07.413030 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.412297 2572 manager.go:217] Machine: {Timestamp:2026-04-21 02:41:07.410573575 +0000 UTC m=+0.344250654 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099689 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f72c079f7340c856f83f5de8ea833 SystemUUID:ec2f72c0-79f7-340c-856f-83f5de8ea833 BootID:adcbaa4a-bd67-4c7a-9589-22f3f7342fd6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:eb:83:8a:f3:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:83:8a:f3:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:46:f1:13:d4:37 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 02:41:07.413030 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.413005 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 02:41:07.413121 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.413093 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 02:41:07.415424 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.415399 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 02:41:07.415559 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.415425 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-66.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 02:41:07.415611 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.415569 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 02:41:07.415611 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.415577 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 02:41:07.415611 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.415589 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 02:41:07.416204 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.416194 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 02:41:07.417423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.417403 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 02:41:07.417530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.417514 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 02:41:07.419410 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.419400 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 02:41:07.419461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.419413 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 02:41:07.419461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.419425 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 02:41:07.419461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.419434 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 21 02:41:07.419461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.419442 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 02:41:07.420464 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.420452 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 02:41:07.420507 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.420470 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 02:41:07.423045 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.423030 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 02:41:07.424275 ip-10-0-134-66
kubenswrapper[2572]: I0421 02:41:07.424262 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 02:41:07.425872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425860 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 02:41:07.425905 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425889 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 02:41:07.425905 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425899 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 02:41:07.425956 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425908 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 02:41:07.425956 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425917 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 02:41:07.425956 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425926 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 02:41:07.425956 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425940 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 02:41:07.425956 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425949 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 02:41:07.426086 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425959 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 02:41:07.426086 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425967 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 02:41:07.426086 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.425988 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 02:41:07.426086 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.426002 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 02:41:07.426706 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.426693 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 02:41:07.426742 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.426712 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 02:41:07.430137 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.430117 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-66.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 02:41:07.430246 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.430172 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 02:41:07.430246 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.430197 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 02:41:07.430343 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.430277 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 02:41:07.430343 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.430311 2572 server.go:1295] "Started kubelet" Apr 21 02:41:07.430525 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.430458 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 21 02:41:07.431324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.431296 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 02:41:07.431783 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.431754 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 02:41:07.431845 ip-10-0-134-66 systemd[1]: Started Kubernetes Kubelet. Apr 21 02:41:07.432455 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.432371 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 02:41:07.433828 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.433807 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 21 02:41:07.434562 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.434540 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sscxk" Apr 21 02:41:07.437158 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.436440 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-66.ec2.internal.18a83ef33dab8874 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-66.ec2.internal,UID:ip-10-0-134-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-66.ec2.internal,},FirstTimestamp:2026-04-21 02:41:07.430287476 +0000 UTC m=+0.363964545,LastTimestamp:2026-04-21 02:41:07.430287476 +0000 UTC m=+0.363964545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-66.ec2.internal,}" Apr 21 02:41:07.437782 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.437763 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 02:41:07.437782 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.437775 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 02:41:07.438384 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.438368 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 02:41:07.438384 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.438369 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 02:41:07.438504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.438393 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 02:41:07.438558 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.438494 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found" Apr 21 02:41:07.438558 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.438522 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 21 02:41:07.438558 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.438529 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 21 02:41:07.439874 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.439854 2572 factory.go:55] Registering systemd factory Apr 21 02:41:07.439955 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.439883 2572 factory.go:223] Registration of the systemd container factory successfully Apr 21 02:41:07.440205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440188 2572 factory.go:153] Registering CRI-O factory Apr 21 02:41:07.440286 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440208 2572 factory.go:223] Registration of the crio container factory successfully Apr 21 02:41:07.440336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440287 2572 factory.go:221] Registration of the containerd container factory failed: 
unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 02:41:07.440336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440315 2572 factory.go:103] Registering Raw factory Apr 21 02:41:07.440336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440329 2572 manager.go:1196] Started watching for new ooms in manager Apr 21 02:41:07.440757 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.440739 2572 manager.go:319] Starting recovery of all containers Apr 21 02:41:07.442141 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.441969 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sscxk" Apr 21 02:41:07.444301 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.444272 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 02:41:07.444421 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.444399 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 02:41:07.445742 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.445704 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 02:41:07.451610 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.451465 2572 manager.go:324] Recovery completed Apr 21 02:41:07.452872 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.452853 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 21 02:41:07.455876 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.455865 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.458046 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458030 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.458125 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458054 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.458125 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458063 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.458533 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458521 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 02:41:07.458574 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458534 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 02:41:07.458574 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.458570 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 21 02:41:07.460473 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.460461 2572 policy_none.go:49] "None policy: Start" Apr 21 02:41:07.460525 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:41:07.460478 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 02:41:07.460525 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.460488 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 21 02:41:07.508781 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.508766 2572 manager.go:341] "Starting Device Plugin manager" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.508809 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.508822 2572 server.go:85] "Starting device plugin registration server" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.509054 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.509066 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.509157 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.509254 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.509263 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.509897 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 02:41:07.510070 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.509935 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-66.ec2.internal\" not found" Apr 21 02:41:07.609838 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.609776 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.610804 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.610790 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.610883 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.610817 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.610883 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.610827 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.610883 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.610853 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.615757 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.615734 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 02:41:07.616937 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.616922 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 02:41:07.617009 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.616947 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 02:41:07.617009 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.616963 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 02:41:07.617009 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.616969 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 02:41:07.617009 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.617000 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 02:41:07.617219 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.617205 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.617291 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.617245 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-66.ec2.internal\": node \"ip-10-0-134-66.ec2.internal\" not found" Apr 21 02:41:07.619820 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.619794 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:41:07.686143 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.686115 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found" Apr 21 02:41:07.717223 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.717199 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"] Apr 21 02:41:07.717308 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.717298 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.720308 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.720291 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.720416 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.720324 2572 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.720416 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.720339 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.721634 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.721619 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.721759 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.721745 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.721806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.721774 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.722370 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722352 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.722370 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722368 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.722502 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722380 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.722502 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722393 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.722502 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722409 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 
02:41:07.722502 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.722395 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.723475 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.723463 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.723531 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.723485 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:41:07.724092 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.724076 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:41:07.724150 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.724104 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:41:07.724150 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.724114 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:41:07.740082 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.740057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.740082 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.740082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.740203 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.740097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod \"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.754825 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.754805 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-66.ec2.internal\" not found" node="ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.759191 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.759176 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-66.ec2.internal\" not found" node="ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.786200 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.786179 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found" Apr 21 02:41:07.840856 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.840829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.840856 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:41:07.840859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.841021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.840883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod \"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.841021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.840916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.841021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.840930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" Apr 21 02:41:07.841021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:07.840921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod 
\"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 21 02:41:07.887108 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.887039 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:07.987748 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:07.987708 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.057196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.057169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 21 02:41:08.062784 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.062768 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 21 02:41:08.088377 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.088348 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.188934 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.188847 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.289327 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.289298 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.359749 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.359721 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 02:41:08.360148 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.359852 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 02:41:08.390044 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.390013 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.437911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.437888 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 02:41:08.444270 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.444196 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 02:36:07 +0000 UTC" deadline="2028-02-02 17:21:58.265033568 +0000 UTC"
Apr 21 02:41:08.444270 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.444219 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15662h40m49.820817693s"
Apr 21 02:41:08.445792 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.445776 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 02:41:08.467434 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.467414 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ncfng"
Apr 21 02:41:08.472447 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.472428 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ncfng"
Apr 21 02:41:08.490449 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.490428 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.591059 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.591028 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.620580 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:08.620553 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2993d233f9a00d06d5f483ee8282a6f2.slice/crio-7571ead32f1d0f1b5586646d953f6aa79a4a8b3adab72732dcdc3128b5356fcc WatchSource:0}: Error finding container 7571ead32f1d0f1b5586646d953f6aa79a4a8b3adab72732dcdc3128b5356fcc: Status 404 returned error can't find the container with id 7571ead32f1d0f1b5586646d953f6aa79a4a8b3adab72732dcdc3128b5356fcc
Apr 21 02:41:08.621018 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:08.620998 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19015d708a7f5256313024ebc4553800.slice/crio-4f932adc6b0a64758a28429220021596a493d3dfc35187b68b72ceed608bf015 WatchSource:0}: Error finding container 4f932adc6b0a64758a28429220021596a493d3dfc35187b68b72ceed608bf015: Status 404 returned error can't find the container with id 4f932adc6b0a64758a28429220021596a493d3dfc35187b68b72ceed608bf015
Apr 21 02:41:08.625168 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.625152 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 02:41:08.653810 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.653791 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.691267 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:08.691244 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 21 02:41:08.700959 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.700920 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.780187 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.780159 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:08.838743 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.838720 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 21 02:41:08.851372 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.851356 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 02:41:08.852383 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.852372 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 21 02:41:08.862409 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:08.862391 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 02:41:09.343932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.343849 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 02:41:09.420932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.420896 2572 apiserver.go:52] "Watching apiserver"
Apr 21 02:41:09.426565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.426531 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 02:41:09.427859 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.427726 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2fslz","kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal","openshift-multus/multus-additional-cni-plugins-h5dpc","openshift-multus/multus-rb6n9","openshift-network-diagnostics/network-check-target-mxtcd","openshift-network-operator/iptables-alerter-nkmg8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd","openshift-cluster-node-tuning-operator/tuned-5z2x9","openshift-dns/node-resolver-m26kz","openshift-image-registry/node-ca-vk9x8","openshift-multus/network-metrics-daemon-2f9pd","openshift-ovn-kubernetes/ovnkube-node-ckmv8","kube-system/global-pull-secret-syncer-sn49d"]
Apr 21 02:41:09.430455 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.430432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.432680 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.432653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.432863 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.432848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 02:41:09.433056 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.433028 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v66n5\""
Apr 21 02:41:09.433158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.433141 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.435333 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.435256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.437196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.437177 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.437602 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.437585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jht6t\""
Apr 21 02:41:09.437941 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.437924 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 02:41:09.438138 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.438115 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.438345 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.438329 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.438690 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.438669 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 02:41:09.438913 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.438893 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 02:41:09.438988 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.438938 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 02:41:09.440255 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.440008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9jt4z\""
Apr 21 02:41:09.440393 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.440210 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 02:41:09.440393 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.440390 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.440616 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.440599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.440678 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.440642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.442666 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.442645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 02:41:09.442887 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.442866 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.443034 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.443014 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:09.443112 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.443080 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:09.443159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.443142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 02:41:09.443212 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.443168 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 02:41:09.443546 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.443320 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.443643 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.443444 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rmwfk\""
Apr 21 02:41:09.446535 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.446296 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.449056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2fslz"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.449306 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.449797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-etc-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.449842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-node-log\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.449878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749d4\" (UniqueName: \"kubernetes.io/projected/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-kube-api-access-749d4\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-netd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450073 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-config\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450107 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fhqk2\""
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-registration-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-etc-kubernetes\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-slash\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-bin\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.450461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-etc-selinux\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kube-api-access-8ssv2\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-var-lib-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-script-lib\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-socket-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.450951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-system-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-socket-dir-parent\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-daemon-config\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-log-socket\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-conf-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.451583 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-os-release\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-kubelet\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwrg\" (UniqueName: \"kubernetes.io/projected/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-kube-api-access-bmwrg\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-cnibin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-k8s-cni-cncf-io\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-env-overrides\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-iptables-alerter-script\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-host-slash\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-systemd-units\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6xq\" (UniqueName: \"kubernetes.io/projected/9c680a54-5b50-40b0-b0c3-514ce8751675-kube-api-access-vx6xq\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452003 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-spxj6\""
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cnibin\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-multus\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-system-cni-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-cni-binary-copy\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.452330 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-hostroot\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.451593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452476 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452696 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-sys-fs\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-netns\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c680a54-5b50-40b0-b0c3-514ce8751675-ovn-node-metrics-cert\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.452972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-os-release\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9mc\" (UniqueName: \"kubernetes.io/projected/d4201ca3-b268-4f22-89b7-f74f860bac2e-kube-api-access-hg9mc\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-netns\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453257 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-ovn\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-device-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.453885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-bin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-multus-certs\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.453885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-kubelet\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.453885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-systemd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.454193 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.453938 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:41:09.454193 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.454063 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4wd74\""
Apr 21 02:41:09.454320 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.454297 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 02:41:09.455606 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.455533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:09.455696 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.455647 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:09.461856 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.461838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.462504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.462465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.463978 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.463955 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 02:41:09.464334 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.464313 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 02:41:09.465496 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.464622 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-88fsd\"" Apr 21 02:41:09.465592 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.465195 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:09.465713 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.465691 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:09.465775 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.465341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zcs4l\"" Apr 21 02:41:09.465827 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.465369 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 02:41:09.465898 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.465367 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 02:41:09.465952 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.465325 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 02:41:09.474804 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.474777 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:36:08 +0000 UTC" deadline="2027-09-22 12:07:49.348212253 +0000 UTC" Apr 21 02:41:09.474901 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.474805 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12465h26m39.873410282s" Apr 21 02:41:09.539697 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.539676 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 02:41:09.553646 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-netns\") pod \"ovnkube-node-ckmv8\" (UID: 
\"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.553646 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-sys\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55cefcc2-8412-4791-ab29-e4fbdd117f4a-hosts-file\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55cefcc2-8412-4791-ab29-e4fbdd117f4a-tmp-dir\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-ovn\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-device-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.553834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-bin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-tmp\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-kubelet-config\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-multus-certs\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-kubelet\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-systemd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-etc-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553972 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-node-log\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.553999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:09.554047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-749d4\" (UniqueName: \"kubernetes.io/projected/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-kube-api-access-749d4\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-netd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-config\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:41:09.554102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-netns\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-registration-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-device-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-registration-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-bin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-etc-kubernetes\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-modprobe-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-multus-certs\") pod 
\"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-konnectivity-ca\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-ovn\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-netd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-etc-kubernetes\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.554518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-kubelet\") pod \"ovnkube-node-ckmv8\" (UID: 
\"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554345 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-systemd\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-slash\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-run\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-slash\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554580 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-var-lib-kubelet\") pod \"tuned-5z2x9\" (UID: 
\"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-etc-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-bin\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-node-log\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-cni-bin\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-etc-selinux\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: 
\"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kube-api-access-8ssv2\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.554983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-kubernetes\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-var-lib-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-script-lib\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.555226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-socket-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-system-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-socket-dir-parent\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555279 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-daemon-config\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-conf\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtc9z\" (UniqueName: \"kubernetes.io/projected/55cefcc2-8412-4791-ab29-e4fbdd117f4a-kube-api-access-wtc9z\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-log-socket\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-conf-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6756c\" (UniqueName: \"kubernetes.io/projected/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-kube-api-access-6756c\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-os-release\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-kubelet\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-systemd\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgm67\" (UniqueName: \"kubernetes.io/projected/f3ca7174-0a17-4896-b723-717a079d23e3-kube-api-access-lgm67\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555599 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-dbus\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwrg\" (UniqueName: \"kubernetes.io/projected/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-kube-api-access-bmwrg\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-cnibin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-k8s-cni-cncf-io\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.555740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-etc-selinux\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-env-overrides\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-log-socket\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-conf-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-os-release\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-kubelet\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-cnibin\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.556805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-config\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-system-cni-dir\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-socket-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-socket-dir-parent\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-k8s-cni-cncf-io\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.556954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.557266 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-var-lib-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.557589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-multus-daemon-config\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.557589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.557589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.557769 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-env-overrides\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-agent-certs\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-iptables-alerter-script\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c680a54-5b50-40b0-b0c3-514ce8751675-ovnkube-script-lib\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-host-slash\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.557936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-systemd-units\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6xq\" (UniqueName: \"kubernetes.io/projected/9c680a54-5b50-40b0-b0c3-514ce8751675-kube-api-access-vx6xq\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.557978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cnibin\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-multus\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-lib-modules\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17e20017-572f-4919-8639-2e7007feee0b-host\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-system-cni-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-cni-binary-copy\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-hostroot\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-host\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.558258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-tuned\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tqfr\" (UniqueName: \"kubernetes.io/projected/17e20017-572f-4919-8639-2e7007feee0b-kube-api-access-9tqfr\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-sys-fs\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-sys-fs\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-netns\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558499 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17e20017-572f-4919-8639-2e7007feee0b-serviceca\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c680a54-5b50-40b0-b0c3-514ce8751675-ovn-node-metrics-cert\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-os-release\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9mc\" (UniqueName: \"kubernetes.io/projected/d4201ca3-b268-4f22-89b7-f74f860bac2e-kube-api-access-hg9mc\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.558727 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysconfig\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-system-cni-dir\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-run-netns\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.558895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-host-slash\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-run-openvswitch\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-os-release\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.559394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-host-var-lib-cni-multus\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.559744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-cnibin\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.559744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4201ca3-b268-4f22-89b7-f74f860bac2e-hostroot\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.559744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-iptables-alerter-script\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.559744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559524 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c680a54-5b50-40b0-b0c3-514ce8751675-systemd-units\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.560179 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.559955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4201ca3-b268-4f22-89b7-f74f860bac2e-cni-binary-copy\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.560344 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.560317 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 02:41:09.564621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.564561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c680a54-5b50-40b0-b0c3-514ce8751675-ovn-node-metrics-cert\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.564751 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.564655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-749d4\" (UniqueName: \"kubernetes.io/projected/e533c790-5675-4c8b-bd06-8c68e6ca4ac0-kube-api-access-749d4\") pod \"iptables-alerter-nkmg8\" (UID: \"e533c790-5675-4c8b-bd06-8c68e6ca4ac0\") " pod="openshift-network-operator/iptables-alerter-nkmg8"
Apr 21 02:41:09.564819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.564767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssv2\" (UniqueName: \"kubernetes.io/projected/32453d3a-63ed-4e6e-b2c0-47e7ad529f69-kube-api-access-8ssv2\") pod \"aws-ebs-csi-driver-node-jzpvd\" (UID: \"32453d3a-63ed-4e6e-b2c0-47e7ad529f69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd"
Apr 21 02:41:09.566030 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.566009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwrg\" (UniqueName: \"kubernetes.io/projected/ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f-kube-api-access-bmwrg\") pod \"multus-additional-cni-plugins-h5dpc\" (UID: \"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f\") " pod="openshift-multus/multus-additional-cni-plugins-h5dpc"
Apr 21 02:41:09.566217 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.566196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9mc\" (UniqueName: \"kubernetes.io/projected/d4201ca3-b268-4f22-89b7-f74f860bac2e-kube-api-access-hg9mc\") pod \"multus-rb6n9\" (UID: \"d4201ca3-b268-4f22-89b7-f74f860bac2e\") " pod="openshift-multus/multus-rb6n9"
Apr 21 02:41:09.566656 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.566636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6xq\" (UniqueName: \"kubernetes.io/projected/9c680a54-5b50-40b0-b0c3-514ce8751675-kube-api-access-vx6xq\") pod \"ovnkube-node-ckmv8\" (UID: \"9c680a54-5b50-40b0-b0c3-514ce8751675\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8"
Apr 21 02:41:09.621197 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.621103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerStarted","Data":"7571ead32f1d0f1b5586646d953f6aa79a4a8b3adab72732dcdc3128b5356fcc"}
Apr 21 02:41:09.622097 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.622069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" event={"ID":"19015d708a7f5256313024ebc4553800","Type":"ContainerStarted","Data":"4f932adc6b0a64758a28429220021596a493d3dfc35187b68b72ceed608bf015"}
Apr 21 02:41:09.659817 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-conf\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.659817 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtc9z\" (UniqueName: \"kubernetes.io/projected/55cefcc2-8412-4791-ab29-e4fbdd117f4a-kube-api-access-wtc9z\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6756c\" (UniqueName: \"kubernetes.io/projected/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-kube-api-access-6756c\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-systemd\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgm67\" (UniqueName: \"kubernetes.io/projected/f3ca7174-0a17-4896-b723-717a079d23e3-kube-api-access-lgm67\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-dbus\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-agent-certs\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.659971 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-conf\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-systemd\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.659995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-lib-modules\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.660071 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.160034438 +0000 UTC m=+3.093711509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17e20017-572f-4919-8639-2e7007feee0b-host\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8"
Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-host\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9"
Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-dbus\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421
02:41:09.660145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-tuned\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-lib-modules\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tqfr\" (UniqueName: \"kubernetes.io/projected/17e20017-572f-4919-8639-2e7007feee0b-kube-api-access-9tqfr\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-host\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17e20017-572f-4919-8639-2e7007feee0b-serviceca\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660192 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17e20017-572f-4919-8639-2e7007feee0b-host\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.660313 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.660368 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.160349454 +0000 UTC m=+3.094026508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysconfig\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-sys\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55cefcc2-8412-4791-ab29-e4fbdd117f4a-hosts-file\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.660578 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55cefcc2-8412-4791-ab29-e4fbdd117f4a-tmp-dir\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-sys\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysconfig\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-tmp\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-kubelet-config\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/55cefcc2-8412-4791-ab29-e4fbdd117f4a-hosts-file\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-modprobe-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-sysctl-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660664 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-kubelet-config\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-konnectivity-ca\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17e20017-572f-4919-8639-2e7007feee0b-serviceca\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-run\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-var-lib-kubelet\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-kubernetes\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-modprobe-d\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55cefcc2-8412-4791-ab29-e4fbdd117f4a-tmp-dir\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.661385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-run\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.662273 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-kubernetes\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.662273 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.660869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-var-lib-kubelet\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.662273 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.661122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-konnectivity-ca\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:09.662885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.662864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-tmp\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.663054 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.663036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-etc-tuned\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.663273 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.663252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d-agent-certs\") pod \"konnectivity-agent-2fslz\" (UID: \"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d\") " pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:09.667326 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.666322 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:09.667326 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.666343 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:09.667326 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.666356 2572 projected.go:194] Error preparing data for projected 
volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:09.667326 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:09.666416 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.166395985 +0000 UTC m=+3.100073059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:09.668255 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.668223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgm67\" (UniqueName: \"kubernetes.io/projected/f3ca7174-0a17-4896-b723-717a079d23e3-kube-api-access-lgm67\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:09.668355 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.668331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6756c\" (UniqueName: \"kubernetes.io/projected/09c18ab9-7867-4a5c-8b6d-a40d3e898daa-kube-api-access-6756c\") pod \"tuned-5z2x9\" (UID: \"09c18ab9-7867-4a5c-8b6d-a40d3e898daa\") " pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.668629 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.668599 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtc9z\" (UniqueName: \"kubernetes.io/projected/55cefcc2-8412-4791-ab29-e4fbdd117f4a-kube-api-access-wtc9z\") pod \"node-resolver-m26kz\" (UID: \"55cefcc2-8412-4791-ab29-e4fbdd117f4a\") " pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.669548 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.669530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tqfr\" (UniqueName: \"kubernetes.io/projected/17e20017-572f-4919-8639-2e7007feee0b-kube-api-access-9tqfr\") pod \"node-ca-vk9x8\" (UID: \"17e20017-572f-4919-8639-2e7007feee0b\") " pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:09.743826 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.743785 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nkmg8" Apr 21 02:41:09.757819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.757790 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:09.770604 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.770581 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" Apr 21 02:41:09.783373 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.783346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" Apr 21 02:41:09.793051 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.793026 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rb6n9" Apr 21 02:41:09.799689 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.799668 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:09.806335 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.806310 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" Apr 21 02:41:09.813837 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.813820 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m26kz" Apr 21 02:41:09.820374 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:09.820355 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vk9x8" Apr 21 02:41:10.164248 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.164194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:10.164427 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.164279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:10.164427 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.164363 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:10.164537 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.164437 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs 
podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:11.164417006 +0000 UTC m=+4.098094063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:10.164537 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.164374 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:10.164537 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.164503 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:11.164486859 +0000 UTC m=+4.098163928 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:10.264915 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.264879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:10.265062 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.265013 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:10.265062 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.265036 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:10.265062 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.265047 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:10.265159 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:10.265101 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. 
No retries permitted until 2026-04-21 02:41:11.265084619 +0000 UTC m=+4.198761679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:10.288587 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.288554 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c680a54_5b50_40b0_b0c3_514ce8751675.slice/crio-9e6ac29646cf000aee08c22e81a7c06911c23bf918ff505fff8ccbd8c31cc93a WatchSource:0}: Error finding container 9e6ac29646cf000aee08c22e81a7c06911c23bf918ff505fff8ccbd8c31cc93a: Status 404 returned error can't find the container with id 9e6ac29646cf000aee08c22e81a7c06911c23bf918ff505fff8ccbd8c31cc93a Apr 21 02:41:10.290682 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.290538 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c18ab9_7867_4a5c_8b6d_a40d3e898daa.slice/crio-8a60bbcc04dcb3004e9324361ab988561fe33fdf06512703fd9716da926d8af5 WatchSource:0}: Error finding container 8a60bbcc04dcb3004e9324361ab988561fe33fdf06512703fd9716da926d8af5: Status 404 returned error can't find the container with id 8a60bbcc04dcb3004e9324361ab988561fe33fdf06512703fd9716da926d8af5 Apr 21 02:41:10.291731 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.291629 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34dd0ac_0e5c_4910_bc4e_288c8fff8b5d.slice/crio-d2e5f2137a5b5b473e12588a95b6ffe10c0a2b5656bdf23dd0cd506c5af7653d WatchSource:0}: Error finding container 
d2e5f2137a5b5b473e12588a95b6ffe10c0a2b5656bdf23dd0cd506c5af7653d: Status 404 returned error can't find the container with id d2e5f2137a5b5b473e12588a95b6ffe10c0a2b5656bdf23dd0cd506c5af7653d
Apr 21 02:41:10.294360 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.294318 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cefcc2_8412_4791_ab29_e4fbdd117f4a.slice/crio-1b461c629918a04bc6c9040f86e20bcb8da93a630fe8a8c24dd725984a4967be WatchSource:0}: Error finding container 1b461c629918a04bc6c9040f86e20bcb8da93a630fe8a8c24dd725984a4967be: Status 404 returned error can't find the container with id 1b461c629918a04bc6c9040f86e20bcb8da93a630fe8a8c24dd725984a4967be
Apr 21 02:41:10.294958 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.294933 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode533c790_5675_4c8b_bd06_8c68e6ca4ac0.slice/crio-fa93d257c3bad756fabe092eb136f5e293ef707b2a10dc12eab200a6982580b3 WatchSource:0}: Error finding container fa93d257c3bad756fabe092eb136f5e293ef707b2a10dc12eab200a6982580b3: Status 404 returned error can't find the container with id fa93d257c3bad756fabe092eb136f5e293ef707b2a10dc12eab200a6982580b3
Apr 21 02:41:10.315522 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.315406 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4201ca3_b268_4f22_89b7_f74f860bac2e.slice/crio-493e1988c705b6e5c44419f45e8f392f8111668e77779abaa2f822d37b4af61f WatchSource:0}: Error finding container 493e1988c705b6e5c44419f45e8f392f8111668e77779abaa2f822d37b4af61f: Status 404 returned error can't find the container with id 493e1988c705b6e5c44419f45e8f392f8111668e77779abaa2f822d37b4af61f
Apr 21 02:41:10.316211 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.316191 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e20017_572f_4919_8639_2e7007feee0b.slice/crio-9b0ca963afb389440935dcb71c2553e58a34614065557171c8a3311e27724669 WatchSource:0}: Error finding container 9b0ca963afb389440935dcb71c2553e58a34614065557171c8a3311e27724669: Status 404 returned error can't find the container with id 9b0ca963afb389440935dcb71c2553e58a34614065557171c8a3311e27724669
Apr 21 02:41:10.317382 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.317361 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32453d3a_63ed_4e6e_b2c0_47e7ad529f69.slice/crio-cc66af05a3135f70fc501b69cb3f7732bbb913b884a7b7f8a1f3ef6ec56002f8 WatchSource:0}: Error finding container cc66af05a3135f70fc501b69cb3f7732bbb913b884a7b7f8a1f3ef6ec56002f8: Status 404 returned error can't find the container with id cc66af05a3135f70fc501b69cb3f7732bbb913b884a7b7f8a1f3ef6ec56002f8
Apr 21 02:41:10.318101 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:41:10.318079 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee8dd925_7241_4c8b_a7b6_bcbf8bab3a8f.slice/crio-11dd7f04ea4551b62b5c48e238656d739fcf4fde1417cc67c654243309213573 WatchSource:0}: Error finding container 11dd7f04ea4551b62b5c48e238656d739fcf4fde1417cc67c654243309213573: Status 404 returned error can't find the container with id 11dd7f04ea4551b62b5c48e238656d739fcf4fde1417cc67c654243309213573
Apr 21 02:41:10.475857 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.475544 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:36:08 +0000 UTC" deadline="2027-10-27 17:55:53.129976348 +0000 UTC"
Apr 21 02:41:10.475857 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.475787 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13311h14m42.654193667s"
Apr 21 02:41:10.623953 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.623909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vk9x8" event={"ID":"17e20017-572f-4919-8639-2e7007feee0b","Type":"ContainerStarted","Data":"9b0ca963afb389440935dcb71c2553e58a34614065557171c8a3311e27724669"}
Apr 21 02:41:10.624790 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.624765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nkmg8" event={"ID":"e533c790-5675-4c8b-bd06-8c68e6ca4ac0","Type":"ContainerStarted","Data":"fa93d257c3bad756fabe092eb136f5e293ef707b2a10dc12eab200a6982580b3"}
Apr 21 02:41:10.625786 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.625755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m26kz" event={"ID":"55cefcc2-8412-4791-ab29-e4fbdd117f4a","Type":"ContainerStarted","Data":"1b461c629918a04bc6c9040f86e20bcb8da93a630fe8a8c24dd725984a4967be"}
Apr 21 02:41:10.627593 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.627563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2fslz" event={"ID":"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d","Type":"ContainerStarted","Data":"d2e5f2137a5b5b473e12588a95b6ffe10c0a2b5656bdf23dd0cd506c5af7653d"}
Apr 21 02:41:10.628932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.628892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"9e6ac29646cf000aee08c22e81a7c06911c23bf918ff505fff8ccbd8c31cc93a"}
Apr 21 02:41:10.630217 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.630185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" event={"ID":"32453d3a-63ed-4e6e-b2c0-47e7ad529f69","Type":"ContainerStarted","Data":"cc66af05a3135f70fc501b69cb3f7732bbb913b884a7b7f8a1f3ef6ec56002f8"}
Apr 21 02:41:10.631209 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.631188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rb6n9" event={"ID":"d4201ca3-b268-4f22-89b7-f74f860bac2e","Type":"ContainerStarted","Data":"493e1988c705b6e5c44419f45e8f392f8111668e77779abaa2f822d37b4af61f"}
Apr 21 02:41:10.632228 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.632206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" event={"ID":"09c18ab9-7867-4a5c-8b6d-a40d3e898daa","Type":"ContainerStarted","Data":"8a60bbcc04dcb3004e9324361ab988561fe33fdf06512703fd9716da926d8af5"}
Apr 21 02:41:10.633888 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.633866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" event={"ID":"19015d708a7f5256313024ebc4553800","Type":"ContainerStarted","Data":"46b4349e0cfbfe14d6b35aabc5efdc121cb3170e5dde85ead89c681e644875f3"}
Apr 21 02:41:10.635007 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.634987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerStarted","Data":"11dd7f04ea4551b62b5c48e238656d739fcf4fde1417cc67c654243309213573"}
Apr 21 02:41:10.647850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:10.647798 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" podStartSLOduration=2.647786795 podStartE2EDuration="2.647786795s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:41:10.646366663 +0000 UTC m=+3.580043739" watchObservedRunningTime="2026-04-21 02:41:10.647786795 +0000 UTC m=+3.581463862"
Apr 21 02:41:11.171893 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.171854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:11.172075 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.171930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:11.172145 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.172076 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:11.172145 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.172139 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:13.172120986 +0000 UTC m=+6.105798045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:11.172540 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.172520 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:11.172636 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.172571 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:13.172557055 +0000 UTC m=+6.106234108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:11.273324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.273227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:11.273485 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.273443 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:11.273485 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.273462 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:11.273485 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.273475 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:11.273655 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.273532 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:13.273513461 +0000 UTC m=+6.207190532 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:11.622504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.622002 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:11.622504 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.622116 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:11.622504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.622422 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:11.623043 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.622533 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:11.623043 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.622580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:11.623043 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:11.622664 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:11.654138 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.653620 2572 generic.go:358] "Generic (PLEG): container finished" podID="2993d233f9a00d06d5f483ee8282a6f2" containerID="15e41a5e84edc6eb9aba8c0ace4568f788325837628db0de8319601ea26994d9" exitCode=0
Apr 21 02:41:11.654138 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:11.653766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerDied","Data":"15e41a5e84edc6eb9aba8c0ace4568f788325837628db0de8319601ea26994d9"}
Apr 21 02:41:12.659284 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:12.659228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerStarted","Data":"39a5353dddaa66c7818032faccde8f5e6c9c9cc3585717ca7d61a9c17168f3cd"}
Apr 21 02:41:12.684323 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:12.684152 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" podStartSLOduration=4.684134214 podStartE2EDuration="4.684134214s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:41:12.683780734 +0000 UTC m=+5.617457812" watchObservedRunningTime="2026-04-21 02:41:12.684134214 +0000 UTC m=+5.617811290"
Apr 21 02:41:13.191475 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.191440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:13.191660 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.191506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:13.191660 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.191651 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:13.191761 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.191708 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:17.191689592 +0000 UTC m=+10.125366662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:13.192120 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.191932 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:13.192120 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.192000 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:17.191982768 +0000 UTC m=+10.125659841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:13.292834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.292792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:13.293010 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.292971 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:13.293010 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.292992 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:13.293010 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.293006 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:13.293174 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.293072 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:17.293052593 +0000 UTC m=+10.226729651 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:13.617901 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.617820 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:13.618057 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.617968 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:13.618115 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.618060 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:13.618175 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.618156 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:13.618175 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:13.618165 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:13.618305 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:13.618282 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:15.617445 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:15.617400 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:15.617914 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:15.617545 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:15.617989 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:15.617936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:15.617989 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:15.617960 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:15.618087 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:15.618031 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:15.618140 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:15.618115 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:17.227308 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.227269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:17.227756 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.227347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:17.227756 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.227511 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:17.227756 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.227578 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:25.227557254 +0000 UTC m=+18.161234329 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:17.228025 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.228006 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:17.228090 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.228066 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:25.228049052 +0000 UTC m=+18.161726129 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:17.328218 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.328181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:17.328390 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.328359 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 02:41:17.328390 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.328379 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 02:41:17.328390 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.328392 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:17.328536 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.328448 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:25.328428526 +0000 UTC m=+18.262105601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 02:41:17.618562 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.618485 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:17.618716 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.618601 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:17.618944 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.618929 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:17.619058 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.619038 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:17.619130 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:17.619085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:17.619183 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:17.619165 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:19.617528 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:19.617433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:19.617528 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:19.617471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:19.617996 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:19.617434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:19.617996 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:19.617585 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:19.617996 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:19.617700 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:19.617996 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:19.617761 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:21.617696 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:21.617654 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:21.618133 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:21.617705 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:21.618133 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:21.617764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:21.618133 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:21.617871 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:21.618133 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:21.617998 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:21.618133 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:21.618097 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:23.617216 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:23.617182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd"
Apr 21 02:41:23.617647 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:23.617328 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e"
Apr 21 02:41:23.617647 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:23.617359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:23.617647 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:23.617541 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3"
Apr 21 02:41:23.617647 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:23.617570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:23.617647 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:23.617635 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7"
Apr 21 02:41:25.284070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.284030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd"
Apr 21 02:41:25.284576 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.284094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d"
Apr 21 02:41:25.284576 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.284188 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 02:41:25.284576 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.284207 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 02:41:25.284576 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.284273 2572 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.284251752 +0000 UTC m=+34.217928808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:25.284576 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.284290 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.284282943 +0000 UTC m=+34.217959996 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:25.385192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.385158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:25.385376 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.385321 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:25.385376 ip-10-0-134-66 
kubenswrapper[2572]: E0421 02:41:25.385344 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:25.385376 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.385356 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:25.385513 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.385411 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.385397074 +0000 UTC m=+34.319074127 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:25.618268 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.618171 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:25.618268 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.618171 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:25.618512 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:25.618171 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:25.618512 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.618305 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:25.618512 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.618438 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:25.618644 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:25.618526 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:27.618590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.618117 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:27.619205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.618191 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:27.619205 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:27.618698 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:27.619205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.618212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:27.619205 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:27.618802 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:27.619205 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:27.618862 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:27.692048 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.692015 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="3aa0669c0b6766dd1c8593e613bd167f271ca8de2fcac07114d363f3c28ffa42" exitCode=0 Apr 21 02:41:27.692158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.692116 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"3aa0669c0b6766dd1c8593e613bd167f271ca8de2fcac07114d363f3c28ffa42"} Apr 21 02:41:27.693415 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.693388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vk9x8" event={"ID":"17e20017-572f-4919-8639-2e7007feee0b","Type":"ContainerStarted","Data":"82c1713983fa16565297496d7da043c75454621be4549b3277411fa27cc8fad6"} Apr 21 02:41:27.694720 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.694696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m26kz" event={"ID":"55cefcc2-8412-4791-ab29-e4fbdd117f4a","Type":"ContainerStarted","Data":"6ee64515d2b4c51f3111e06efe1a5b5452a93934ee80373387d4ffc4ac975466"} Apr 21 02:41:27.696115 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.696097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2fslz" event={"ID":"e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d","Type":"ContainerStarted","Data":"ea3789a74666d0d3922d6fd366413332f159fa7b0c1215218e1829af22ffaeb9"} Apr 21 02:41:27.701585 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.701564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" 
event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"7056781887f4faf13a71f510fedea87a701afb6c6e29a28496ea6035a23142ce"} Apr 21 02:41:27.701585 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.701590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"466808aecc9bc17d11900227ec30a71b3e2b936b3b73bbedeb57dbe6241c4334"} Apr 21 02:41:27.701716 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.701604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"7a3b699ac0038b88ae41ec2b43e840624a7494d9bd19ace85545eeee07257e1b"} Apr 21 02:41:27.701716 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.701616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"52d042e36e3147cf0500784fb998af5c0c2d9780e9e1bcddb3d329dc7ef36f60"} Apr 21 02:41:27.701716 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.701627 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"9dea876501052350153fe5876bed78a70ad898295f58d57061fb5bae12b0122f"} Apr 21 02:41:27.702962 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.702941 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" event={"ID":"32453d3a-63ed-4e6e-b2c0-47e7ad529f69","Type":"ContainerStarted","Data":"9f6f8699da907f4358ae5df50c5485004fe9fd4ca0a2028a0f357637fa05448a"} Apr 21 02:41:27.704572 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.704548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-rb6n9" event={"ID":"d4201ca3-b268-4f22-89b7-f74f860bac2e","Type":"ContainerStarted","Data":"e84f675f57b3c3c46672323d501207feb78ef8be4b58b168f4901b587dc6664e"} Apr 21 02:41:27.705877 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.705855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" event={"ID":"09c18ab9-7867-4a5c-8b6d-a40d3e898daa","Type":"ContainerStarted","Data":"f723afac23a34867333c18ed7fd518f4201de664d7cc2d7b2a9348cba5971fc0"} Apr 21 02:41:27.729272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.729179 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rb6n9" podStartSLOduration=3.752250921 podStartE2EDuration="20.729159861s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.317224258 +0000 UTC m=+3.250901326" lastFinishedPulling="2026-04-21 02:41:27.294133199 +0000 UTC m=+20.227810266" observedRunningTime="2026-04-21 02:41:27.72832928 +0000 UTC m=+20.662006356" watchObservedRunningTime="2026-04-21 02:41:27.729159861 +0000 UTC m=+20.662836984" Apr 21 02:41:27.758094 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.758050 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2fslz" podStartSLOduration=3.868381076 podStartE2EDuration="20.758037125s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.294001768 +0000 UTC m=+3.227678832" lastFinishedPulling="2026-04-21 02:41:27.18365782 +0000 UTC m=+20.117334881" observedRunningTime="2026-04-21 02:41:27.75797292 +0000 UTC m=+20.691649995" watchObservedRunningTime="2026-04-21 02:41:27.758037125 +0000 UTC m=+20.691714201" Apr 21 02:41:27.758490 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.758462 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m26kz" 
podStartSLOduration=3.888878654 podStartE2EDuration="20.7584552s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.314067142 +0000 UTC m=+3.247744210" lastFinishedPulling="2026-04-21 02:41:27.183643699 +0000 UTC m=+20.117320756" observedRunningTime="2026-04-21 02:41:27.743680448 +0000 UTC m=+20.677357526" watchObservedRunningTime="2026-04-21 02:41:27.7584552 +0000 UTC m=+20.692132321" Apr 21 02:41:27.771333 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.771262 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5z2x9" podStartSLOduration=3.880019758 podStartE2EDuration="20.771248413s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.292550616 +0000 UTC m=+3.226227671" lastFinishedPulling="2026-04-21 02:41:27.183779259 +0000 UTC m=+20.117456326" observedRunningTime="2026-04-21 02:41:27.7708803 +0000 UTC m=+20.704557376" watchObservedRunningTime="2026-04-21 02:41:27.771248413 +0000 UTC m=+20.704925483" Apr 21 02:41:27.782823 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:27.782784 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vk9x8" podStartSLOduration=3.920217825 podStartE2EDuration="20.782771728s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.321098356 +0000 UTC m=+3.254775414" lastFinishedPulling="2026-04-21 02:41:27.183652253 +0000 UTC m=+20.117329317" observedRunningTime="2026-04-21 02:41:27.78249619 +0000 UTC m=+20.716173263" watchObservedRunningTime="2026-04-21 02:41:27.782771728 +0000 UTC m=+20.716448805" Apr 21 02:41:28.372767 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.372739 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 02:41:28.520139 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:41:28.520024 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T02:41:28.372760615Z","UUID":"3094e72f-b368-4c9a-90ae-e842162c324b","Handler":null,"Name":"","Endpoint":""} Apr 21 02:41:28.521785 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.521762 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 02:41:28.521898 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.521792 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 02:41:28.709542 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.709509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nkmg8" event={"ID":"e533c790-5675-4c8b-bd06-8c68e6ca4ac0","Type":"ContainerStarted","Data":"6a409a174c7e2268406d58055fd29006950fcb434b89b6a3cd9eceedaa80dae1"} Apr 21 02:41:28.712472 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.712418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"1b880c2c0e500a58c115b39df6f734a96526877142aae9a71e7b73f6142700a6"} Apr 21 02:41:28.714596 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.714528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" event={"ID":"32453d3a-63ed-4e6e-b2c0-47e7ad529f69","Type":"ContainerStarted","Data":"aa343d75ccb17280734a943321340030b2832ee8a57c8988f1f046ea566aecac"} Apr 21 02:41:28.723196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:28.723156 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-operator/iptables-alerter-nkmg8" podStartSLOduration=4.85358609 podStartE2EDuration="21.723142802s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.314066103 +0000 UTC m=+3.247743168" lastFinishedPulling="2026-04-21 02:41:27.183622818 +0000 UTC m=+20.117299880" observedRunningTime="2026-04-21 02:41:28.722663612 +0000 UTC m=+21.656340688" watchObservedRunningTime="2026-04-21 02:41:28.723142802 +0000 UTC m=+21.656819878" Apr 21 02:41:29.617606 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:29.617530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:29.617606 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:29.617586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:29.617868 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:29.617633 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:29.617868 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:29.617732 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:29.617868 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:29.617801 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:29.618000 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:29.617885 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:29.717287 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:29.717254 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" event={"ID":"32453d3a-63ed-4e6e-b2c0-47e7ad529f69","Type":"ContainerStarted","Data":"98e2100785c63ab55343a0e9595623f2fc3c35214a11fb422881d32a1b3aff4d"} Apr 21 02:41:29.732663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:29.732610 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jzpvd" podStartSLOduration=3.86219192 podStartE2EDuration="22.732592973s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.321742195 +0000 UTC m=+3.255419256" lastFinishedPulling="2026-04-21 02:41:29.19214324 +0000 UTC m=+22.125820309" observedRunningTime="2026-04-21 02:41:29.732438122 +0000 UTC m=+22.666115200" watchObservedRunningTime="2026-04-21 02:41:29.732592973 +0000 UTC m=+22.666270050" Apr 21 02:41:30.362061 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:30.362032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:30.362737 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:30.362716 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2fslz" Apr 21 
02:41:30.721950 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:30.721907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"16958f2da8607f9f2a42701d14f2e7d085b028a36629cf1951f6c5278c38ba1b"} Apr 21 02:41:30.722401 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:30.722280 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:30.722857 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:30.722837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2fslz" Apr 21 02:41:31.618093 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:31.618058 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:31.618302 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:31.618122 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:31.618302 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:31.618152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:31.618302 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:31.618255 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:31.618474 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:31.618358 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:31.618474 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:31.618455 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:32.730665 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.730341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" event={"ID":"9c680a54-5b50-40b0-b0c3-514ce8751675","Type":"ContainerStarted","Data":"4ef1bc4fb2c33342698f2cfd8b0e338ec9206b35089caea33959edbac1af370a"} Apr 21 02:41:32.731487 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.730701 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:32.731487 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.730724 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:32.731487 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.730735 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:32.732293 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.732267 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="7e8614fa59b4e3c53dc5765731c651fd5b1d645d69884e7f561f192fdb970b97" exitCode=0 Apr 21 02:41:32.732432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.732351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"7e8614fa59b4e3c53dc5765731c651fd5b1d645d69884e7f561f192fdb970b97"} Apr 21 02:41:32.746331 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.746309 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:32.746647 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.746630 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:41:32.754149 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:32.754115 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" podStartSLOduration=8.807628438 podStartE2EDuration="25.754105066s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.290900324 +0000 UTC m=+3.224577391" lastFinishedPulling="2026-04-21 02:41:27.23737695 +0000 UTC m=+20.171054019" observedRunningTime="2026-04-21 02:41:32.753180779 +0000 UTC m=+25.686857849" watchObservedRunningTime="2026-04-21 02:41:32.754105066 +0000 UTC m=+25.687782141" Apr 21 02:41:33.618126 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.617946 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:33.618262 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.617946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:33.618262 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.618225 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:33.618346 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.617946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:33.618346 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.618278 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:33.618424 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.618382 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:33.735567 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.735526 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="eed4550a6ab270ad8a614de2395a8080554004e4bdf7a94f28699077db7eb572" exitCode=0 Apr 21 02:41:33.735876 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.735590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"eed4550a6ab270ad8a614de2395a8080554004e4bdf7a94f28699077db7eb572"} Apr 21 02:41:33.923840 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.923809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sn49d"] Apr 21 02:41:33.923994 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.923915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:33.924066 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.924019 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:33.926913 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.926888 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mxtcd"] Apr 21 02:41:33.927031 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.926982 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:33.927110 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.927091 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:33.927612 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.927592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2f9pd"] Apr 21 02:41:33.927700 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:33.927683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:33.927815 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:33.927790 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:34.739680 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:34.739647 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="abf097c769f7680542695558eb73ce6a1a0dec583d157855f7f028fb656b82eb" exitCode=0 Apr 21 02:41:34.740070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:34.739737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"abf097c769f7680542695558eb73ce6a1a0dec583d157855f7f028fb656b82eb"} Apr 21 02:41:35.617712 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:35.617683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:35.617869 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:35.617681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:35.617869 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:35.617817 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:35.617869 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:35.617683 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:35.618005 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:35.617906 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:35.618005 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:35.617993 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:37.618867 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:37.618836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:37.619286 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:37.618914 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:37.619286 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:37.619032 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:37.619286 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:37.619029 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:37.619286 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:37.619130 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:37.619286 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:37.619211 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:39.618132 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:39.617887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:39.618619 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:39.617887 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:39.618619 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:39.618279 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f9pd" podUID="f3ca7174-0a17-4896-b723-717a079d23e3" Apr 21 02:41:39.618619 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:39.617887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:39.618619 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:39.618327 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sn49d" podUID="8d474c84-b62f-4695-9dda-3d8d9e6aacb7" Apr 21 02:41:39.618619 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:39.618424 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxtcd" podUID="8c974cfd-58e5-4552-b37f-4c663e11283e" Apr 21 02:41:40.385355 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.385331 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeReady" Apr 21 02:41:40.385532 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.385460 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 02:41:40.427983 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.427959 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5lb4v"] Apr 21 02:41:40.442561 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.442540 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pzlnx"] Apr 21 02:41:40.442710 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.442690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.444839 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.444817 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 02:41:40.444839 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.444835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:41:40.444998 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.444915 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 02:41:40.469834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.469811 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5lb4v"] Apr 21 02:41:40.469950 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.469843 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzlnx"] 
Apr 21 02:41:40.470007 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.469967 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.472228 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.472206 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 02:41:40.472359 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.472339 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:41:40.472470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.472453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 02:41:40.472571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.472549 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 02:41:40.598907 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.598872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnczz\" (UniqueName: \"kubernetes.io/projected/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-kube-api-access-jnczz\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.599073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.598925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-config-volume\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.599073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.598956 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwvh\" (UniqueName: \"kubernetes.io/projected/b386dadb-05b0-41e3-8db6-4a3771883f69-kube-api-access-6rwvh\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.599073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.598991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.599073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.599040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-tmp-dir\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.599336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.599095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.700065 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.700378 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:41:40.700080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnczz\" (UniqueName: \"kubernetes.io/projected/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-kube-api-access-jnczz\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.700378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-config-volume\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.700378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwvh\" (UniqueName: \"kubernetes.io/projected/b386dadb-05b0-41e3-8db6-4a3771883f69-kube-api-access-6rwvh\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.700378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.700378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-tmp-dir\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.700546 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.700470 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-tmp-dir\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.700587 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:40.700555 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:40.700620 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:40.700595 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.200582257 +0000 UTC m=+34.134259312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:40.700860 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:40.700833 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:40.700974 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:40.700898 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:41:41.200881403 +0000 UTC m=+34.134558470 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:41:40.701248 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.701210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-config-volume\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.709860 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.709838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnczz\" (UniqueName: \"kubernetes.io/projected/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-kube-api-access-jnczz\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:40.709940 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.709880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwvh\" (UniqueName: \"kubernetes.io/projected/b386dadb-05b0-41e3-8db6-4a3771883f69-kube-api-access-6rwvh\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:40.751414 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.751379 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="c685f4b19a3f9c89bfe5aa8f8e6cc5ccfc8a308db9caa261c6464d6eacfed321" exitCode=0 Apr 21 02:41:40.751526 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:40.751455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" 
event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"c685f4b19a3f9c89bfe5aa8f8e6cc5ccfc8a308db9caa261c6464d6eacfed321"} Apr 21 02:41:41.204185 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.204093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:41.204185 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.204145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:41.204455 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.204268 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:41.204455 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.204272 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:41.204455 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.204327 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:42.204312127 +0000 UTC m=+35.137989181 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:41.204455 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.204354 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:41:42.204332298 +0000 UTC m=+35.138009357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:41:41.305069 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.305032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:41.305208 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.305086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:41.305208 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.305168 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:41.305208 ip-10-0-134-66 kubenswrapper[2572]: 
E0421 02:41:41.305190 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:41.305356 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.305225 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret podName:8d474c84-b62f-4695-9dda-3d8d9e6aacb7 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:13.305211848 +0000 UTC m=+66.238888902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret") pod "global-pull-secret-syncer-sn49d" (UID: "8d474c84-b62f-4695-9dda-3d8d9e6aacb7") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:41.305356 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.305295 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:13.305277514 +0000 UTC m=+66.238954572 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:41.406085 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.406055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:41.406202 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.406187 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:41.406256 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.406207 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:41.406256 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.406216 2572 projected.go:194] Error preparing data for projected volume kube-api-access-crsqc for pod openshift-network-diagnostics/network-check-target-mxtcd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:41.406319 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:41.406284 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc podName:8c974cfd-58e5-4552-b37f-4c663e11283e nodeName:}" failed. 
No retries permitted until 2026-04-21 02:42:13.406269896 +0000 UTC m=+66.339946954 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-crsqc" (UniqueName: "kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc") pod "network-check-target-mxtcd" (UID: "8c974cfd-58e5-4552-b37f-4c663e11283e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:41.617884 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.617803 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:41:41.618118 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.617809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:41:41.618118 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.617812 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:41:41.620125 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620088 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 02:41:41.620125 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 02:41:41.620335 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nsvjd\"" Apr 21 02:41:41.620335 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620273 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 02:41:41.620446 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\"" Apr 21 02:41:41.620799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.620782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 02:41:41.755656 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.755627 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f" containerID="32c0a9820542e02c4fb77e0553ef371faa7623afab91d5c3a448269753d00316" exitCode=0 Apr 21 02:41:41.756368 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:41.755672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerDied","Data":"32c0a9820542e02c4fb77e0553ef371faa7623afab91d5c3a448269753d00316"} Apr 21 
02:41:42.213557 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:42.213466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:42.213557 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:42.213519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:42.213789 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:42.213606 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:42.213789 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:42.213609 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:42.213789 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:42.213668 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:44.213654008 +0000 UTC m=+37.147331063 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:42.213789 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:42.213683 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:41:44.213677393 +0000 UTC m=+37.147354447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:41:42.759921 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:42.759879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" event={"ID":"ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f","Type":"ContainerStarted","Data":"4b680364aed3b8f4c0db2bb00d3b2b6379679ba054efecd4a504fc42077a7187"} Apr 21 02:41:42.780425 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:42.780227 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-h5dpc" podStartSLOduration=5.683714276 podStartE2EDuration="35.780214197s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:41:10.321783566 +0000 UTC m=+3.255460640" lastFinishedPulling="2026-04-21 02:41:40.418283507 +0000 UTC m=+33.351960561" observedRunningTime="2026-04-21 02:41:42.77873476 +0000 UTC m=+35.712411848" watchObservedRunningTime="2026-04-21 02:41:42.780214197 +0000 UTC m=+35.713891267" Apr 21 02:41:44.227750 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:44.227709 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:44.228182 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:44.227774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:44.228182 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:44.227861 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:44.228182 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:44.227878 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:44.228182 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:44.227928 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:48.22791366 +0000 UTC m=+41.161590717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:44.228182 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:44.227941 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. 
No retries permitted until 2026-04-21 02:41:48.227936189 +0000 UTC m=+41.161613243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:41:48.254806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:48.254773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:48.255199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:48.254817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:48.255199 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:48.254910 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:48.255199 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:48.254951 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:56.254938552 +0000 UTC m=+49.188615611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:48.255199 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:48.254910 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:48.255199 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:48.254979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:41:56.254973537 +0000 UTC m=+49.188650591 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:41:56.310196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:56.310154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:41:56.310603 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:41:56.310205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:41:56.310603 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:56.310311 2572 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:56.310603 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:56.310334 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:56.310603 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:56.310364 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:12.310350446 +0000 UTC m=+65.244027501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:41:56.310603 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:41:56.310401 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:42:12.310388056 +0000 UTC m=+65.244065110 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:42:04.749867 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:04.749838 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckmv8" Apr 21 02:42:12.314458 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:12.314395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:42:12.314951 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:12.314510 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:42:12.314951 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:12.314547 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:42:12.314951 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:12.314627 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls podName:7d9a58cd-c703-4352-b873-ebc3e5cc1cfd nodeName:}" failed. No retries permitted until 2026-04-21 02:42:44.314603284 +0000 UTC m=+97.248280379 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls") pod "dns-default-5lb4v" (UID: "7d9a58cd-c703-4352-b873-ebc3e5cc1cfd") : secret "dns-default-metrics-tls" not found Apr 21 02:42:12.314951 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:12.314699 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:42:12.314951 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:12.314744 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert podName:b386dadb-05b0-41e3-8db6-4a3771883f69 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:44.31472986 +0000 UTC m=+97.248406929 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert") pod "ingress-canary-pzlnx" (UID: "b386dadb-05b0-41e3-8db6-4a3771883f69") : secret "canary-serving-cert" not found Apr 21 02:42:13.320931 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.320889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:42:13.321411 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.320952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:42:13.323426 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:42:13.323401 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 02:42:13.323529 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.323459 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 02:42:13.332094 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:13.332073 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 02:42:13.332195 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:13.332142 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs podName:f3ca7174-0a17-4896-b723-717a079d23e3 nodeName:}" failed. No retries permitted until 2026-04-21 02:43:17.332121065 +0000 UTC m=+130.265798124 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs") pod "network-metrics-daemon-2f9pd" (UID: "f3ca7174-0a17-4896-b723-717a079d23e3") : secret "metrics-daemon-secret" not found Apr 21 02:42:13.334347 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.334329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d474c84-b62f-4695-9dda-3d8d9e6aacb7-original-pull-secret\") pod \"global-pull-secret-syncer-sn49d\" (UID: \"8d474c84-b62f-4695-9dda-3d8d9e6aacb7\") " pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:42:13.421911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.421865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " 
pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:42:13.424244 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.424220 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 02:42:13.433064 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.433043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sn49d" Apr 21 02:42:13.434950 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.434748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 02:42:13.445994 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.445970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crsqc\" (UniqueName: \"kubernetes.io/projected/8c974cfd-58e5-4552-b37f-4c663e11283e-kube-api-access-crsqc\") pod \"network-check-target-mxtcd\" (UID: \"8c974cfd-58e5-4552-b37f-4c663e11283e\") " pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:42:13.552932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.552904 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sn49d"] Apr 21 02:42:13.557674 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:13.557647 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d474c84_b62f_4695_9dda_3d8d9e6aacb7.slice/crio-880304871ae5b3e4faad745c01b91fdd0ee52faa31bcd707e6c26c706c440be6 WatchSource:0}: Error finding container 880304871ae5b3e4faad745c01b91fdd0ee52faa31bcd707e6c26c706c440be6: Status 404 returned error can't find the container with id 880304871ae5b3e4faad745c01b91fdd0ee52faa31bcd707e6c26c706c440be6 Apr 21 02:42:13.729926 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.729904 2572 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nsvjd\"" Apr 21 02:42:13.738612 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.738593 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:42:13.820291 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.819991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sn49d" event={"ID":"8d474c84-b62f-4695-9dda-3d8d9e6aacb7","Type":"ContainerStarted","Data":"880304871ae5b3e4faad745c01b91fdd0ee52faa31bcd707e6c26c706c440be6"} Apr 21 02:42:13.864455 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:13.864428 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mxtcd"] Apr 21 02:42:13.867588 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:13.867558 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c974cfd_58e5_4552_b37f_4c663e11283e.slice/crio-15f4c82e13fb21698e8caca72345b43de5a48cab042f19b24dd23139d520e19f WatchSource:0}: Error finding container 15f4c82e13fb21698e8caca72345b43de5a48cab042f19b24dd23139d520e19f: Status 404 returned error can't find the container with id 15f4c82e13fb21698e8caca72345b43de5a48cab042f19b24dd23139d520e19f Apr 21 02:42:14.822994 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:14.822947 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mxtcd" event={"ID":"8c974cfd-58e5-4552-b37f-4c663e11283e","Type":"ContainerStarted","Data":"15f4c82e13fb21698e8caca72345b43de5a48cab042f19b24dd23139d520e19f"} Apr 21 02:42:16.994971 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:16.994936 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7bc899fb4-fbmpx"] Apr 21 02:42:16.997981 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:16.997956 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.000256 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000201 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 02:42:17.000256 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000215 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 02:42:17.000256 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000253 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 02:42:17.000471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 02:42:17.000471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000219 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 02:42:17.000471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4dfs2\"" Apr 21 02:42:17.000471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.000428 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 02:42:17.011289 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.011261 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bc899fb4-fbmpx"] Apr 21 02:42:17.050703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.050676 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-default-certificate\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.050864 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.050739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.050864 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.050770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.050864 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.050796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-stats-auth\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.051004 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.050872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhzq\" (UniqueName: \"kubernetes.io/projected/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-kube-api-access-bkhzq\") pod \"router-default-7bc899fb4-fbmpx\" (UID: 
\"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.151883 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.151851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-default-certificate\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.152067 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.151900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.152067 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.151932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.152067 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.151964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-stats-auth\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.152067 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.152007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhzq\" (UniqueName: 
\"kubernetes.io/projected/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-kube-api-access-bkhzq\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.152302 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.152067 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:17.652045585 +0000 UTC m=+70.585722641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:17.152302 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.152105 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:17.152302 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.152163 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:17.65214659 +0000 UTC m=+70.585823644 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:17.154712 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.154688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-stats-auth\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.154834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.154688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-default-certificate\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.160111 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.160086 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhzq\" (UniqueName: \"kubernetes.io/projected/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-kube-api-access-bkhzq\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.655635 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:17.655605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.655830 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:42:17.655648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:17.655830 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.655740 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:17.655830 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.655790 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:18.655773215 +0000 UTC m=+71.589450280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:17.655830 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:17.655814 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:18.65580538 +0000 UTC m=+71.589482442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:18.663780 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.663723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:18.663780 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.663785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:18.664274 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:18.663899 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:20.66387796 +0000 UTC m=+73.597555014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:18.664274 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:18.663920 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:18.664274 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:18.663976 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:20.663960478 +0000 UTC m=+73.597637549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:18.833038 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.833004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mxtcd" event={"ID":"8c974cfd-58e5-4552-b37f-4c663e11283e","Type":"ContainerStarted","Data":"99287161e425860631da86c1506828582f9495e091f0ac1c78aa50874a8ccfbe"} Apr 21 02:42:18.833195 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.833098 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:42:18.834258 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.834222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sn49d" 
event={"ID":"8d474c84-b62f-4695-9dda-3d8d9e6aacb7","Type":"ContainerStarted","Data":"cff52df1749497e23b3c0474c3fea865ba4a87c5f21234863afc486b22f18ba1"} Apr 21 02:42:18.846969 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.846928 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mxtcd" podStartSLOduration=67.638534598 podStartE2EDuration="1m11.84691729s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:42:13.869360763 +0000 UTC m=+66.803037817" lastFinishedPulling="2026-04-21 02:42:18.077743442 +0000 UTC m=+71.011420509" observedRunningTime="2026-04-21 02:42:18.846352682 +0000 UTC m=+71.780029771" watchObservedRunningTime="2026-04-21 02:42:18.84691729 +0000 UTC m=+71.780594366" Apr 21 02:42:18.858101 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:18.858068 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sn49d" podStartSLOduration=66.334944306 podStartE2EDuration="1m10.858058669s" podCreationTimestamp="2026-04-21 02:41:08 +0000 UTC" firstStartedPulling="2026-04-21 02:42:13.559315548 +0000 UTC m=+66.492992602" lastFinishedPulling="2026-04-21 02:42:18.082429906 +0000 UTC m=+71.016106965" observedRunningTime="2026-04-21 02:42:18.858014289 +0000 UTC m=+71.791691365" watchObservedRunningTime="2026-04-21 02:42:18.858058669 +0000 UTC m=+71.791735723" Apr 21 02:42:20.679628 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:20.679580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:20.679628 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:20.679638 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:20.680042 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:20.679747 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:24.679730067 +0000 UTC m=+77.613407125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:20.680042 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:20.679760 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:20.680042 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:20.679802 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:24.679789041 +0000 UTC m=+77.613466094 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:22.756016 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:22.755992 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m26kz_55cefcc2-8412-4791-ab29-e4fbdd117f4a/dns-node-resolver/0.log" Apr 21 02:42:23.560136 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:23.560106 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vk9x8_17e20017-572f-4919-8639-2e7007feee0b/node-ca/0.log" Apr 21 02:42:24.708091 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:24.708047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:24.708091 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:24.708100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:24.708543 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:24.708218 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:32.708200742 +0000 UTC m=+85.641877809 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:24.708543 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:24.708221 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:24.708543 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:24.708275 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:32.708266813 +0000 UTC m=+85.641943866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:27.137048 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.136910 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799"] Apr 21 02:42:27.138864 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.138846 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8jgxr"] Apr 21 02:42:27.139021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.139002 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.141177 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141157 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl"] Apr 21 02:42:27.141294 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 02:42:27.141361 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141331 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.141423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141403 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:42:27.141537 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141521 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 02:42:27.141602 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141555 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 02:42:27.141602 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.141580 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-92nbv\"" Apr 21 02:42:27.143121 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.143107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 02:42:27.144154 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.143742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.144154 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.143783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 02:42:27.144154 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.143854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bj7h7\"" Apr 21 02:42:27.145297 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.144373 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:42:27.145297 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.144588 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 02:42:27.146803 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.146031 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6zmzk\"" Apr 21 02:42:27.146803 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.146389 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 02:42:27.146803 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.146495 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:42:27.146803 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.146682 2572 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 02:42:27.151841 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.151816 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799"] Apr 21 02:42:27.152866 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.152843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 02:42:27.152959 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.152882 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl"] Apr 21 02:42:27.153573 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.153554 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8jgxr"] Apr 21 02:42:27.224401 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ts98\" (UniqueName: \"kubernetes.io/projected/b7aaabb9-fee1-4cd6-9c82-badd547250ae-kube-api-access-8ts98\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.224565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-trusted-ca\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.224565 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.224565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7aaabb9-fee1-4cd6-9c82-badd547250ae-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.224760 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wl9f\" (UniqueName: \"kubernetes.io/projected/9246de05-b865-4564-bf19-9e73a72a4969-kube-api-access-8wl9f\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.224760 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmn8\" (UniqueName: \"kubernetes.io/projected/8b753ead-78d6-4777-9a4f-dc30f2929420-kube-api-access-ntmn8\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.224760 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:42:27.224632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7aaabb9-fee1-4cd6-9c82-badd547250ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.224760 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9246de05-b865-4564-bf19-9e73a72a4969-serving-cert\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.224760 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.224692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-config\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.325641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9246de05-b865-4564-bf19-9e73a72a4969-serving-cert\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.325641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-config\") 
pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ts98\" (UniqueName: \"kubernetes.io/projected/b7aaabb9-fee1-4cd6-9c82-badd547250ae-kube-api-access-8ts98\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-trusted-ca\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7aaabb9-fee1-4cd6-9c82-badd547250ae-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wl9f\" (UniqueName: \"kubernetes.io/projected/9246de05-b865-4564-bf19-9e73a72a4969-kube-api-access-8wl9f\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.325855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmn8\" (UniqueName: \"kubernetes.io/projected/8b753ead-78d6-4777-9a4f-dc30f2929420-kube-api-access-ntmn8\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.326126 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.325876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7aaabb9-fee1-4cd6-9c82-badd547250ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.326126 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:27.326097 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 02:42:27.326228 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:27.326171 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls podName:8b753ead-78d6-4777-9a4f-dc30f2929420 
nodeName:}" failed. No retries permitted until 2026-04-21 02:42:27.826146694 +0000 UTC m=+80.759823768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g6wpl" (UID: "8b753ead-78d6-4777-9a4f-dc30f2929420") : secret "samples-operator-tls" not found Apr 21 02:42:27.326497 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.326460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7aaabb9-fee1-4cd6-9c82-badd547250ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.326594 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.326514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-config\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.326900 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.326877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9246de05-b865-4564-bf19-9e73a72a4969-trusted-ca\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.328089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.328061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7aaabb9-fee1-4cd6-9c82-badd547250ae-serving-cert\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.328173 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.328096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9246de05-b865-4564-bf19-9e73a72a4969-serving-cert\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.335715 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.335686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ts98\" (UniqueName: \"kubernetes.io/projected/b7aaabb9-fee1-4cd6-9c82-badd547250ae-kube-api-access-8ts98\") pod \"kube-storage-version-migrator-operator-6769c5d45-z9799\" (UID: \"b7aaabb9-fee1-4cd6-9c82-badd547250ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.335832 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.335814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmn8\" (UniqueName: \"kubernetes.io/projected/8b753ead-78d6-4777-9a4f-dc30f2929420-kube-api-access-ntmn8\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.335896 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.335882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wl9f\" (UniqueName: \"kubernetes.io/projected/9246de05-b865-4564-bf19-9e73a72a4969-kube-api-access-8wl9f\") pod \"console-operator-9d4b6777b-8jgxr\" (UID: \"9246de05-b865-4564-bf19-9e73a72a4969\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.454779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.454755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" Apr 21 02:42:27.460496 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.460478 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:27.573140 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.573104 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799"] Apr 21 02:42:27.576037 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:27.576010 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7aaabb9_fee1_4cd6_9c82_badd547250ae.slice/crio-e7506cab12ee1e8198c1e193f789881d72e8b39e22f17a666073fcc4bacbaa4f WatchSource:0}: Error finding container e7506cab12ee1e8198c1e193f789881d72e8b39e22f17a666073fcc4bacbaa4f: Status 404 returned error can't find the container with id e7506cab12ee1e8198c1e193f789881d72e8b39e22f17a666073fcc4bacbaa4f Apr 21 02:42:27.589637 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.589614 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8jgxr"] Apr 21 02:42:27.592315 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:27.592277 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9246de05_b865_4564_bf19_9e73a72a4969.slice/crio-5598d96613b6eaeecc0883cd890efea4c54ad44e3c7383c907451e99a3a3f88e WatchSource:0}: Error finding container 5598d96613b6eaeecc0883cd890efea4c54ad44e3c7383c907451e99a3a3f88e: 
Status 404 returned error can't find the container with id 5598d96613b6eaeecc0883cd890efea4c54ad44e3c7383c907451e99a3a3f88e Apr 21 02:42:27.830221 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.830123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:27.830385 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:27.830306 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 02:42:27.830385 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:27.830370 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls podName:8b753ead-78d6-4777-9a4f-dc30f2929420 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:28.830354711 +0000 UTC m=+81.764031768 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g6wpl" (UID: "8b753ead-78d6-4777-9a4f-dc30f2929420") : secret "samples-operator-tls" not found Apr 21 02:42:27.851799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.851767 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" event={"ID":"9246de05-b865-4564-bf19-9e73a72a4969","Type":"ContainerStarted","Data":"5598d96613b6eaeecc0883cd890efea4c54ad44e3c7383c907451e99a3a3f88e"} Apr 21 02:42:27.852722 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:27.852699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" event={"ID":"b7aaabb9-fee1-4cd6-9c82-badd547250ae","Type":"ContainerStarted","Data":"e7506cab12ee1e8198c1e193f789881d72e8b39e22f17a666073fcc4bacbaa4f"} Apr 21 02:42:28.837974 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:28.837933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:28.838466 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:28.838115 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 02:42:28.838466 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:28.838208 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls 
podName:8b753ead-78d6-4777-9a4f-dc30f2929420 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:30.838181235 +0000 UTC m=+83.771858333 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g6wpl" (UID: "8b753ead-78d6-4777-9a4f-dc30f2929420") : secret "samples-operator-tls" not found Apr 21 02:42:30.854783 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.854747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:30.855185 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:30.854895 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 02:42:30.855185 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:30.854960 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls podName:8b753ead-78d6-4777-9a4f-dc30f2929420 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:34.854946089 +0000 UTC m=+87.788623142 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g6wpl" (UID: "8b753ead-78d6-4777-9a4f-dc30f2929420") : secret "samples-operator-tls" not found Apr 21 02:42:30.860684 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.860655 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" event={"ID":"b7aaabb9-fee1-4cd6-9c82-badd547250ae","Type":"ContainerStarted","Data":"dc90b891be2dd5d3644a93cb3142bebab437bcaf87f720dfaa8ae510ec9b095b"} Apr 21 02:42:30.862146 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.862126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/0.log" Apr 21 02:42:30.862302 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.862163 2572 generic.go:358] "Generic (PLEG): container finished" podID="9246de05-b865-4564-bf19-9e73a72a4969" containerID="be954ed1699d41290ac420667d8fae3a0d938d1fb3c19575262ba011c5beb205" exitCode=255 Apr 21 02:42:30.862302 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.862189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" event={"ID":"9246de05-b865-4564-bf19-9e73a72a4969","Type":"ContainerDied","Data":"be954ed1699d41290ac420667d8fae3a0d938d1fb3c19575262ba011c5beb205"} Apr 21 02:42:30.862413 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.862375 2572 scope.go:117] "RemoveContainer" containerID="be954ed1699d41290ac420667d8fae3a0d938d1fb3c19575262ba011c5beb205" Apr 21 02:42:30.874059 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:30.874013 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" podStartSLOduration=1.592709315 podStartE2EDuration="3.874000066s" podCreationTimestamp="2026-04-21 02:42:27 +0000 UTC" firstStartedPulling="2026-04-21 02:42:27.577838327 +0000 UTC m=+80.511515381" lastFinishedPulling="2026-04-21 02:42:29.859129067 +0000 UTC m=+82.792806132" observedRunningTime="2026-04-21 02:42:30.873648484 +0000 UTC m=+83.807325558" watchObservedRunningTime="2026-04-21 02:42:30.874000066 +0000 UTC m=+83.807677141" Apr 21 02:42:31.869762 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.869731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:42:31.870164 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.870108 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/0.log" Apr 21 02:42:31.870164 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.870139 2572 generic.go:358] "Generic (PLEG): container finished" podID="9246de05-b865-4564-bf19-9e73a72a4969" containerID="0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074" exitCode=255 Apr 21 02:42:31.870252 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.870167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" event={"ID":"9246de05-b865-4564-bf19-9e73a72a4969","Type":"ContainerDied","Data":"0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074"} Apr 21 02:42:31.870252 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.870211 2572 scope.go:117] "RemoveContainer" containerID="be954ed1699d41290ac420667d8fae3a0d938d1fb3c19575262ba011c5beb205" Apr 21 02:42:31.870509 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:31.870487 2572 
scope.go:117] "RemoveContainer" containerID="0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074" Apr 21 02:42:31.870699 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:31.870676 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8jgxr_openshift-console-operator(9246de05-b865-4564-bf19-9e73a72a4969)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" podUID="9246de05-b865-4564-bf19-9e73a72a4969" Apr 21 02:42:32.770090 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:32.770059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:32.770265 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:32.770099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:32.770265 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:32.770224 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 02:42:32.770265 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:32.770249 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. 
No retries permitted until 2026-04-21 02:42:48.77021681 +0000 UTC m=+101.703893880 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : configmap references non-existent config key: service-ca.crt Apr 21 02:42:32.770378 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:32.770289 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs podName:c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b nodeName:}" failed. No retries permitted until 2026-04-21 02:42:48.770281359 +0000 UTC m=+101.703958418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs") pod "router-default-7bc899fb4-fbmpx" (UID: "c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b") : secret "router-metrics-certs-default" not found Apr 21 02:42:32.873501 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:32.873476 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:42:32.873887 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:32.873811 2572 scope.go:117] "RemoveContainer" containerID="0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074" Apr 21 02:42:32.873985 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:32.873968 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8jgxr_openshift-console-operator(9246de05-b865-4564-bf19-9e73a72a4969)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" 
podUID="9246de05-b865-4564-bf19-9e73a72a4969" Apr 21 02:42:33.034073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.033998 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sm629"] Apr 21 02:42:33.038140 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.038125 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.040050 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.040017 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xl5zv\"" Apr 21 02:42:33.040050 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.040031 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 02:42:33.040319 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.040302 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 02:42:33.040388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.040325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 02:42:33.040388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.040305 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 02:42:33.042783 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.042764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sm629"] Apr 21 02:42:33.072363 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.072338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a31898f9-d77c-46a9-b271-d110d424947c-signing-cabundle\") pod 
\"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.072504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.072387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqtkq\" (UniqueName: \"kubernetes.io/projected/a31898f9-d77c-46a9-b271-d110d424947c-kube-api-access-qqtkq\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.072504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.072407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a31898f9-d77c-46a9-b271-d110d424947c-signing-key\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.173344 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.173314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqtkq\" (UniqueName: \"kubernetes.io/projected/a31898f9-d77c-46a9-b271-d110d424947c-kube-api-access-qqtkq\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.173500 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.173354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a31898f9-d77c-46a9-b271-d110d424947c-signing-key\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.173500 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.173451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a31898f9-d77c-46a9-b271-d110d424947c-signing-cabundle\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.174060 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.174037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a31898f9-d77c-46a9-b271-d110d424947c-signing-cabundle\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.176170 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.176151 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a31898f9-d77c-46a9-b271-d110d424947c-signing-key\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.180211 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.180187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqtkq\" (UniqueName: \"kubernetes.io/projected/a31898f9-d77c-46a9-b271-d110d424947c-kube-api-access-qqtkq\") pod \"service-ca-865cb79987-sm629\" (UID: \"a31898f9-d77c-46a9-b271-d110d424947c\") " pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.347260 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.347154 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sm629" Apr 21 02:42:33.454064 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.454035 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sm629"] Apr 21 02:42:33.457185 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:33.457155 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31898f9_d77c_46a9_b271_d110d424947c.slice/crio-897f80853308b62ccdefbe49949d4a775b6209a6ec8ad378226efc94f1dd478a WatchSource:0}: Error finding container 897f80853308b62ccdefbe49949d4a775b6209a6ec8ad378226efc94f1dd478a: Status 404 returned error can't find the container with id 897f80853308b62ccdefbe49949d4a775b6209a6ec8ad378226efc94f1dd478a Apr 21 02:42:33.876747 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:33.876710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sm629" event={"ID":"a31898f9-d77c-46a9-b271-d110d424947c","Type":"ContainerStarted","Data":"897f80853308b62ccdefbe49949d4a775b6209a6ec8ad378226efc94f1dd478a"} Apr 21 02:42:34.888081 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:34.888035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:34.888579 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:34.888213 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 02:42:34.888579 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:34.888306 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls podName:8b753ead-78d6-4777-9a4f-dc30f2929420 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:42.88828229 +0000 UTC m=+95.821959345 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g6wpl" (UID: "8b753ead-78d6-4777-9a4f-dc30f2929420") : secret "samples-operator-tls" not found Apr 21 02:42:35.882801 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:35.882764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sm629" event={"ID":"a31898f9-d77c-46a9-b271-d110d424947c","Type":"ContainerStarted","Data":"47e190b2e0ea8d1c45dce46248afefe820e9591120b31551e299bd504d905825"} Apr 21 02:42:35.897185 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:35.897128 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-sm629" podStartSLOduration=1.35701368 podStartE2EDuration="2.897111148s" podCreationTimestamp="2026-04-21 02:42:33 +0000 UTC" firstStartedPulling="2026-04-21 02:42:33.459103381 +0000 UTC m=+86.392780439" lastFinishedPulling="2026-04-21 02:42:34.999200849 +0000 UTC m=+87.932877907" observedRunningTime="2026-04-21 02:42:35.895987213 +0000 UTC m=+88.829664350" watchObservedRunningTime="2026-04-21 02:42:35.897111148 +0000 UTC m=+88.830788223" Apr 21 02:42:37.461081 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:37.461046 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:37.461081 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:37.461074 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" 
Apr 21 02:42:37.461469 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:37.461398 2572 scope.go:117] "RemoveContainer" containerID="0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074" Apr 21 02:42:37.461567 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:42:37.461550 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8jgxr_openshift-console-operator(9246de05-b865-4564-bf19-9e73a72a4969)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" podUID="9246de05-b865-4564-bf19-9e73a72a4969" Apr 21 02:42:42.956798 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:42.956766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:42.959255 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:42.959211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b753ead-78d6-4777-9a4f-dc30f2929420-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g6wpl\" (UID: \"8b753ead-78d6-4777-9a4f-dc30f2929420\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:43.065332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:43.065291 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" Apr 21 02:42:43.181257 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:43.181210 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl"] Apr 21 02:42:43.899810 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:43.899774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" event={"ID":"8b753ead-78d6-4777-9a4f-dc30f2929420","Type":"ContainerStarted","Data":"b45597df66218322e36e2864ef1200862bd7ae06a87bea4cab6d1cc22dd5c59f"} Apr 21 02:42:44.368709 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.368623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:42:44.369117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.368739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:42:44.371656 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.371632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d9a58cd-c703-4352-b873-ebc3e5cc1cfd-metrics-tls\") pod \"dns-default-5lb4v\" (UID: \"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd\") " pod="openshift-dns/dns-default-5lb4v" Apr 21 02:42:44.371813 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.371791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/b386dadb-05b0-41e3-8db6-4a3771883f69-cert\") pod \"ingress-canary-pzlnx\" (UID: \"b386dadb-05b0-41e3-8db6-4a3771883f69\") " pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:42:44.382902 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.382877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:42:44.391612 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.391593 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzlnx" Apr 21 02:42:44.661868 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.661839 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:42:44.670486 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.670451 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5lb4v" Apr 21 02:42:44.810590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.809466 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzlnx"] Apr 21 02:42:44.814156 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:44.814093 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb386dadb_05b0_41e3_8db6_4a3771883f69.slice/crio-059cb6b76545b67b09b8d2d06199b859009e25159311e10486b910d33a830c20 WatchSource:0}: Error finding container 059cb6b76545b67b09b8d2d06199b859009e25159311e10486b910d33a830c20: Status 404 returned error can't find the container with id 059cb6b76545b67b09b8d2d06199b859009e25159311e10486b910d33a830c20 Apr 21 02:42:44.833298 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.833262 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5lb4v"] Apr 21 02:42:44.842377 ip-10-0-134-66 
kubenswrapper[2572]: W0421 02:42:44.842353 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9a58cd_c703_4352_b873_ebc3e5cc1cfd.slice/crio-4e169fd069ab5a88172e5ed535078ce8222cfe2c8ad4c4c3b6f93194a2ab099e WatchSource:0}: Error finding container 4e169fd069ab5a88172e5ed535078ce8222cfe2c8ad4c4c3b6f93194a2ab099e: Status 404 returned error can't find the container with id 4e169fd069ab5a88172e5ed535078ce8222cfe2c8ad4c4c3b6f93194a2ab099e Apr 21 02:42:44.903793 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.903763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lb4v" event={"ID":"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd","Type":"ContainerStarted","Data":"4e169fd069ab5a88172e5ed535078ce8222cfe2c8ad4c4c3b6f93194a2ab099e"} Apr 21 02:42:44.906449 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.905672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" event={"ID":"8b753ead-78d6-4777-9a4f-dc30f2929420","Type":"ContainerStarted","Data":"3cc96922278b7472620c4d9da6c3548da3b0b89af4a6b676ffb4d20ed9939850"} Apr 21 02:42:44.906449 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.905704 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" event={"ID":"8b753ead-78d6-4777-9a4f-dc30f2929420","Type":"ContainerStarted","Data":"5c95f76c93b44ebe1c3e7394ffc82840bd5e094a76159c6bbb3774373332499b"} Apr 21 02:42:44.907646 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:44.907608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzlnx" event={"ID":"b386dadb-05b0-41e3-8db6-4a3771883f69","Type":"ContainerStarted","Data":"059cb6b76545b67b09b8d2d06199b859009e25159311e10486b910d33a830c20"} Apr 21 02:42:44.921253 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:42:44.921117 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g6wpl" podStartSLOduration=16.411057642 podStartE2EDuration="17.921102157s" podCreationTimestamp="2026-04-21 02:42:27 +0000 UTC" firstStartedPulling="2026-04-21 02:42:43.219995785 +0000 UTC m=+96.153672839" lastFinishedPulling="2026-04-21 02:42:44.730040286 +0000 UTC m=+97.663717354" observedRunningTime="2026-04-21 02:42:44.920190364 +0000 UTC m=+97.853867452" watchObservedRunningTime="2026-04-21 02:42:44.921102157 +0000 UTC m=+97.854779236" Apr 21 02:42:47.920434 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.920392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzlnx" event={"ID":"b386dadb-05b0-41e3-8db6-4a3771883f69","Type":"ContainerStarted","Data":"1f80841b7f8e5c9ded8da410506f1b3f7bee5c2d87e5d3d8a01b04cc8538e761"} Apr 21 02:42:47.921919 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.921895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lb4v" event={"ID":"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd","Type":"ContainerStarted","Data":"0b8b514c05bbe06c252a6e07bebf17bbdc5c9c009d43c9f263aea9bd888817b2"} Apr 21 02:42:47.922031 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.921923 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lb4v" event={"ID":"7d9a58cd-c703-4352-b873-ebc3e5cc1cfd","Type":"ContainerStarted","Data":"e50cd96ac9cfaa13f18e549d6db5457cb625c2000d7e046944cc64f82aef95a7"} Apr 21 02:42:47.922081 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.922033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5lb4v" Apr 21 02:42:47.934024 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.933986 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-pzlnx" podStartSLOduration=65.818714543 podStartE2EDuration="1m7.933976111s" podCreationTimestamp="2026-04-21 02:41:40 +0000 UTC" firstStartedPulling="2026-04-21 02:42:44.816331461 +0000 UTC m=+97.750008518" lastFinishedPulling="2026-04-21 02:42:46.931593029 +0000 UTC m=+99.865270086" observedRunningTime="2026-04-21 02:42:47.933396542 +0000 UTC m=+100.867073617" watchObservedRunningTime="2026-04-21 02:42:47.933976111 +0000 UTC m=+100.867653186" Apr 21 02:42:47.947560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:47.947521 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5lb4v" podStartSLOduration=65.862957772 podStartE2EDuration="1m7.947508654s" podCreationTimestamp="2026-04-21 02:41:40 +0000 UTC" firstStartedPulling="2026-04-21 02:42:44.843817512 +0000 UTC m=+97.777494584" lastFinishedPulling="2026-04-21 02:42:46.928368412 +0000 UTC m=+99.862045466" observedRunningTime="2026-04-21 02:42:47.946959744 +0000 UTC m=+100.880636841" watchObservedRunningTime="2026-04-21 02:42:47.947508654 +0000 UTC m=+100.881185729" Apr 21 02:42:48.617939 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.617913 2572 scope.go:117] "RemoveContainer" containerID="0d91188c85ac1e814dcb024ebbdb44533935d31d3ce8ff0202f82b0219492074" Apr 21 02:42:48.806183 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.806145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:48.806183 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.806185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:48.806938 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.806910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-service-ca-bundle\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:48.808508 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.808489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b-metrics-certs\") pod \"router-default-7bc899fb4-fbmpx\" (UID: \"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b\") " pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:48.926927 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.926906 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:42:48.927297 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.927019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" event={"ID":"9246de05-b865-4564-bf19-9e73a72a4969","Type":"ContainerStarted","Data":"b8ca7bc30e6dacd067feee41392dd2e1aa0f8d4df099de4dad9f3492f393670c"} Apr 21 02:42:48.927617 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.927598 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:48.942651 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:48.942611 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" podStartSLOduration=19.67509209 podStartE2EDuration="21.942598684s" podCreationTimestamp="2026-04-21 02:42:27 +0000 UTC" firstStartedPulling="2026-04-21 02:42:27.593987201 +0000 UTC m=+80.527664256" lastFinishedPulling="2026-04-21 02:42:29.861493793 +0000 UTC m=+82.795170850" observedRunningTime="2026-04-21 02:42:48.941554759 +0000 UTC m=+101.875231837" watchObservedRunningTime="2026-04-21 02:42:48.942598684 +0000 UTC m=+101.876275761" Apr 21 02:42:49.107877 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.107834 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bc899fb4-fbmpx" Apr 21 02:42:49.222516 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.222489 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bc899fb4-fbmpx"] Apr 21 02:42:49.225761 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:49.225734 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b0f00e_ebfb_4a3b_b87d_ccdff2c64c2b.slice/crio-60cc5cb814396c76d05acf72baa5fd267438abac977b95b7fc960d444e23814d WatchSource:0}: Error finding container 60cc5cb814396c76d05acf72baa5fd267438abac977b95b7fc960d444e23814d: Status 404 returned error can't find the container with id 60cc5cb814396c76d05acf72baa5fd267438abac977b95b7fc960d444e23814d Apr 21 02:42:49.668642 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.668615 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8jgxr" Apr 21 02:42:49.838616 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.838584 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mxtcd" Apr 21 02:42:49.931134 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.931059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bc899fb4-fbmpx" event={"ID":"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b","Type":"ContainerStarted","Data":"8f491178f10ff2001e9a2631d2ab28245116e09a11c04dcf96543fb5b8ab14b7"}
Apr 21 02:42:49.931134 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.931096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bc899fb4-fbmpx" event={"ID":"c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b","Type":"ContainerStarted","Data":"60cc5cb814396c76d05acf72baa5fd267438abac977b95b7fc960d444e23814d"}
Apr 21 02:42:49.947945 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:49.947901 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7bc899fb4-fbmpx" podStartSLOduration=33.947887637 podStartE2EDuration="33.947887637s" podCreationTimestamp="2026-04-21 02:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:42:49.946906909 +0000 UTC m=+102.880583997" watchObservedRunningTime="2026-04-21 02:42:49.947887637 +0000 UTC m=+102.881564713"
Apr 21 02:42:50.108339 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:50.108298 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7bc899fb4-fbmpx"
Apr 21 02:42:50.110928 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:50.110902 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7bc899fb4-fbmpx"
Apr 21 02:42:50.936207 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:50.936167 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7bc899fb4-fbmpx"
Apr 21 02:42:50.937376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:50.937357 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7bc899fb4-fbmpx"
Apr 21 02:42:54.924363 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.924335 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bb69dcc45-djkbs"]
Apr 21 02:42:54.930440 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.930414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:54.932728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.932707 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 02:42:54.932838 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.932747 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlcjx\""
Apr 21 02:42:54.932838 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.932789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 02:42:54.933561 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.933546 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 02:42:54.939597 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.939178 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 02:42:54.940961 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.940939 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bb69dcc45-djkbs"]
Apr 21 02:42:54.989969 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.989939 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"]
Apr 21 02:42:54.992960 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.992944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:54.997519 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.997501 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dkmgn\""
Apr 21 02:42:54.997665 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:54.997648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 02:42:55.007872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.007843 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"]
Apr 21 02:42:55.017914 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.017892 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2652q"]
Apr 21 02:42:55.021132 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.021115 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.023932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.023917 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 02:42:55.024405 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.024389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s6bnd\""
Apr 21 02:42:55.024579 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.024567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 02:42:55.024873 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.024861 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 02:42:55.026627 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.026610 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 02:42:55.040420 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.040401 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2652q"]
Apr 21 02:42:55.053946 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.053924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-bound-sa-token\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.053958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-image-registry-private-configuration\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.053978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-installation-pull-secrets\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.054006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-tls\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054138 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.054069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6kcm\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-kube-api-access-m6kcm\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054168 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.054150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-trusted-ca\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054200 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.054185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-ca-trust-extracted\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.054246 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.054207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-certificates\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155446 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-certificates\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fd0a03f0-ed39-4a9c-9384-114d2033c596-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dbh5m\" (UID: \"fd0a03f0-ed39-4a9c-9384-114d2033c596\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4np\" (UniqueName: \"kubernetes.io/projected/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-api-access-7l4np\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-bound-sa-token\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-image-registry-private-configuration\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155633 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-installation-pull-secrets\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-tls\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6kcm\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-kube-api-access-m6kcm\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-data-volume\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-trusted-ca\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-crio-socket\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.155995 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.155854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-ca-trust-extracted\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.156376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.156197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-ca-trust-extracted\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.156479 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.156449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-certificates\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.156774 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.156747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-trusted-ca\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.158163 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.158142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-image-registry-private-configuration\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.158296 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.158254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-installation-pull-secrets\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.158353 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.158293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-registry-tls\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.163376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.163354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-bound-sa-token\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.163475 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.163463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6kcm\" (UniqueName: \"kubernetes.io/projected/515fdc09-dc3e-4b87-a3c5-8db3f15b342f-kube-api-access-m6kcm\") pod \"image-registry-7bb69dcc45-djkbs\" (UID: \"515fdc09-dc3e-4b87-a3c5-8db3f15b342f\") " pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.242444 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.242413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.256386 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-data-volume\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256517 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256517 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-crio-socket\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256517 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fd0a03f0-ed39-4a9c-9384-114d2033c596-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dbh5m\" (UID: \"fd0a03f0-ed39-4a9c-9384-114d2033c596\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:55.256668 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256668 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4np\" (UniqueName: \"kubernetes.io/projected/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-api-access-7l4np\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256833 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-data-volume\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.256925 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.256908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-crio-socket\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.257367 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.257347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.259143 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.259117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.259308 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.259290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fd0a03f0-ed39-4a9c-9384-114d2033c596-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-dbh5m\" (UID: \"fd0a03f0-ed39-4a9c-9384-114d2033c596\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:55.264160 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.264103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4np\" (UniqueName: \"kubernetes.io/projected/dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f-kube-api-access-7l4np\") pod \"insights-runtime-extractor-2652q\" (UID: \"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f\") " pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.304373 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.303931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:55.329337 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.329265 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2652q"
Apr 21 02:42:55.378028 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.377973 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bb69dcc45-djkbs"]
Apr 21 02:42:55.383993 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:55.383958 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod515fdc09_dc3e_4b87_a3c5_8db3f15b342f.slice/crio-583a10f5067789899da665d6b678e6489c9dc3301d7214a5db9cbbc6d01d1025 WatchSource:0}: Error finding container 583a10f5067789899da665d6b678e6489c9dc3301d7214a5db9cbbc6d01d1025: Status 404 returned error can't find the container with id 583a10f5067789899da665d6b678e6489c9dc3301d7214a5db9cbbc6d01d1025
Apr 21 02:42:55.449971 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.449940 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"]
Apr 21 02:42:55.452747 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:55.452624 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0a03f0_ed39_4a9c_9384_114d2033c596.slice/crio-030701f16bb628b52af5ee78618ec9a1c2aba732c767a763be1ae64596ffc829 WatchSource:0}: Error finding container 030701f16bb628b52af5ee78618ec9a1c2aba732c767a763be1ae64596ffc829: Status 404 returned error can't find the container with id 030701f16bb628b52af5ee78618ec9a1c2aba732c767a763be1ae64596ffc829
Apr 21 02:42:55.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.474827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2652q"]
Apr 21 02:42:55.478393 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:42:55.478369 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddccfeac3_9f28_4ca2_9fc1_95181bb0ba0f.slice/crio-b2d569a61de9e65011e528e96140ac51d4987e0e66bbaf1c8c6b1ee630419084 WatchSource:0}: Error finding container b2d569a61de9e65011e528e96140ac51d4987e0e66bbaf1c8c6b1ee630419084: Status 404 returned error can't find the container with id b2d569a61de9e65011e528e96140ac51d4987e0e66bbaf1c8c6b1ee630419084
Apr 21 02:42:55.953283 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.952865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs" event={"ID":"515fdc09-dc3e-4b87-a3c5-8db3f15b342f","Type":"ContainerStarted","Data":"38659ca9ed0dc85ed097b7364a089b46963fe5bd23f5dde2ac679ccccf05def8"}
Apr 21 02:42:55.953283 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.952906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs" event={"ID":"515fdc09-dc3e-4b87-a3c5-8db3f15b342f","Type":"ContainerStarted","Data":"583a10f5067789899da665d6b678e6489c9dc3301d7214a5db9cbbc6d01d1025"}
Apr 21 02:42:55.953283 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.952950 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs"
Apr 21 02:42:55.955205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.955082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2652q" event={"ID":"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f","Type":"ContainerStarted","Data":"567f98f546d2b8e938b7b3172b97367f6bda5438a6d585acb27771230fafeb51"}
Apr 21 02:42:55.955205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.955117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2652q" event={"ID":"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f","Type":"ContainerStarted","Data":"b2d569a61de9e65011e528e96140ac51d4987e0e66bbaf1c8c6b1ee630419084"}
Apr 21 02:42:55.956568 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.956504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m" event={"ID":"fd0a03f0-ed39-4a9c-9384-114d2033c596","Type":"ContainerStarted","Data":"030701f16bb628b52af5ee78618ec9a1c2aba732c767a763be1ae64596ffc829"}
Apr 21 02:42:55.970158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:55.970120 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs" podStartSLOduration=1.970106825 podStartE2EDuration="1.970106825s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:42:55.969181444 +0000 UTC m=+108.902858520" watchObservedRunningTime="2026-04-21 02:42:55.970106825 +0000 UTC m=+108.903783936"
Apr 21 02:42:56.961096 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:56.961052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2652q" event={"ID":"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f","Type":"ContainerStarted","Data":"4d7c763f82a48da41cd32a57a50282dbf523ad9e5c04b17ab1d456284ae80bd2"}
Apr 21 02:42:56.962574 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:56.962544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m" event={"ID":"fd0a03f0-ed39-4a9c-9384-114d2033c596","Type":"ContainerStarted","Data":"4b659a0a985719d310b52eb1219a08752237767ea830180cb200b33b3c48cc6c"}
Apr 21 02:42:56.962726 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:56.962699 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:56.968807 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:56.968777 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m"
Apr 21 02:42:56.976116 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:56.976070 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-dbh5m" podStartSLOduration=2.006288875 podStartE2EDuration="2.976058912s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="2026-04-21 02:42:55.454936821 +0000 UTC m=+108.388613876" lastFinishedPulling="2026-04-21 02:42:56.424706844 +0000 UTC m=+109.358383913" observedRunningTime="2026-04-21 02:42:56.975412392 +0000 UTC m=+109.909089474" watchObservedRunningTime="2026-04-21 02:42:56.976058912 +0000 UTC m=+109.909735988"
Apr 21 02:42:57.929978 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:57.929948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5lb4v"
Apr 21 02:42:57.968040 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:57.968006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2652q" event={"ID":"dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f","Type":"ContainerStarted","Data":"c603578a6bda50c8712a5658dc7824a6b875f285ed90b11eda2625e8c0e68a75"}
Apr 21 02:42:57.988521 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:42:57.988431 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2652q" podStartSLOduration=1.766080283 podStartE2EDuration="3.988410155s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="2026-04-21 02:42:55.528472731 +0000 UTC m=+108.462149791" lastFinishedPulling="2026-04-21 02:42:57.750802609 +0000 UTC m=+110.684479663" observedRunningTime="2026-04-21 02:42:57.987777608 +0000 UTC m=+110.921454689" watchObservedRunningTime="2026-04-21 02:42:57.988410155 +0000 UTC m=+110.922087232"
Apr 21 02:43:01.934276 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.934227 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9kctj"]
Apr 21 02:43:01.939590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.939566 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:01.941859 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.941835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-blx9s\""
Apr 21 02:43:01.942468 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942448 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 02:43:01.942620 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942514 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 02:43:01.942620 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942603 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 02:43:01.942794 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942626 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 02:43:01.942794 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942550 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 02:43:01.942794 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:01.942551 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 02:43:02.109955 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.109927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-metrics-client-ca\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:02.109955 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.109960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.109979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-root\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.109997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-tls\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.110076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-textfile\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj"
Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.110114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9d8\" (UniqueName:
\"kubernetes.io/projected/0a67cba2-1302-4cf5-a038-09168abcdd03-kube-api-access-mw9d8\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.110142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-sys\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.110159 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.110159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.110352 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.110188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-wtmp\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211576 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-metrics-client-ca\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211576 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:43:02.211545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-root\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-tls\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-textfile\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9d8\" (UniqueName: \"kubernetes.io/projected/0a67cba2-1302-4cf5-a038-09168abcdd03-kube-api-access-mw9d8\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 
02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-sys\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.211777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212059 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-wtmp\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212059 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-sys\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212059 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.211938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-wtmp\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " 
pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212203 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.212183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-metrics-client-ca\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212467 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.212441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212562 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.212474 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-textfile\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.212562 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.212508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a67cba2-1302-4cf5-a038-09168abcdd03-root\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.214648 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.214627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.214797 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.214769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a67cba2-1302-4cf5-a038-09168abcdd03-node-exporter-tls\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.219081 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.219057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9d8\" (UniqueName: \"kubernetes.io/projected/0a67cba2-1302-4cf5-a038-09168abcdd03-kube-api-access-mw9d8\") pod \"node-exporter-9kctj\" (UID: \"0a67cba2-1302-4cf5-a038-09168abcdd03\") " pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.250054 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.250025 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9kctj" Apr 21 02:43:02.258697 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:43:02.258674 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a67cba2_1302_4cf5_a038_09168abcdd03.slice/crio-0b1d60eb6e2f6acd5cf77cb67ee592aefe1f119edfe714a554c0a72009bd7674 WatchSource:0}: Error finding container 0b1d60eb6e2f6acd5cf77cb67ee592aefe1f119edfe714a554c0a72009bd7674: Status 404 returned error can't find the container with id 0b1d60eb6e2f6acd5cf77cb67ee592aefe1f119edfe714a554c0a72009bd7674 Apr 21 02:43:02.946473 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.946442 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:43:02.951228 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.951207 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:02.953904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953576 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 02:43:02.953904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 02:43:02.953904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953662 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 02:43:02.953904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953742 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 02:43:02.953904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 02:43:02.954272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.953963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 02:43:02.954272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.954032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qxlzm\"" Apr 21 02:43:02.954272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.954063 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 02:43:02.954272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.954122 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 02:43:02.954272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.954135 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 02:43:02.964219 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.964196 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:43:02.985621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:02.985586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kctj" event={"ID":"0a67cba2-1302-4cf5-a038-09168abcdd03","Type":"ContainerStarted","Data":"0b1d60eb6e2f6acd5cf77cb67ee592aefe1f119edfe714a554c0a72009bd7674"} Apr 21 02:43:03.119120 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119225 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119133 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119225 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119350 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119350 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119350 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119485 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119485 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119381 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkmc\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119485 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119487 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.119621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.119588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.220976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.220470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.223556 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.223501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.223795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.223878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out\") pod 
\"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.223929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkmc\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.223967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
02:43:03.224728 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.224338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.225395 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.225133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.226144 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.226085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.226725 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.226696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:43:03.228022 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.227534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
02:43:03.228022 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.227612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.228174 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.228125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.228698 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.228653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.229011 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.228984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.229148 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.229129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.229300 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.229277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.229991 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.229966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.237686 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.237666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkmc\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc\") pod \"alertmanager-main-0\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.263543 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.263519 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 02:43:03.381139 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.381112 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 02:43:03.384110 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:43:03.384085 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704a669d_91d3_46df_bdd7_8bb9ac616307.slice/crio-61911936d342787d1e63a5b8436656b735ec1a21b63c274359ae0d84918485cb WatchSource:0}: Error finding container 61911936d342787d1e63a5b8436656b735ec1a21b63c274359ae0d84918485cb: Status 404 returned error can't find the container with id 61911936d342787d1e63a5b8436656b735ec1a21b63c274359ae0d84918485cb
Apr 21 02:43:03.990289 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.990252 2572 generic.go:358] "Generic (PLEG): container finished" podID="0a67cba2-1302-4cf5-a038-09168abcdd03" containerID="dc822454d48652fb2f872228bd5b5da08a9f5f4e48316d3532f028b956d4b34b" exitCode=0
Apr 21 02:43:03.990755 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.990335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kctj" event={"ID":"0a67cba2-1302-4cf5-a038-09168abcdd03","Type":"ContainerDied","Data":"dc822454d48652fb2f872228bd5b5da08a9f5f4e48316d3532f028b956d4b34b"}
Apr 21 02:43:03.991542 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:03.991508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"61911936d342787d1e63a5b8436656b735ec1a21b63c274359ae0d84918485cb"}
Apr 21 02:43:04.850963 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.850878 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c454b744c-66tqf"]
Apr 21 02:43:04.854498 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.854480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:04.857813 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.857783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 02:43:04.857977 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.857783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-cloq0fjau410a\""
Apr 21 02:43:04.857977 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.857872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 02:43:04.857977 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.857885 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 02:43:04.858187 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.858168 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 02:43:04.858590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.858448 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 02:43:04.858590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.858473 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-c4dp8\""
Apr 21 02:43:04.873954 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.873931 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c454b744c-66tqf"]
Apr 21 02:43:04.996420 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.996385 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kctj" event={"ID":"0a67cba2-1302-4cf5-a038-09168abcdd03","Type":"ContainerStarted","Data":"267871ac6012f1660a6446c8670e7e6dcec44463dd93bc2279403aa023f65794"}
Apr 21 02:43:04.996894 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.996429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kctj" event={"ID":"0a67cba2-1302-4cf5-a038-09168abcdd03","Type":"ContainerStarted","Data":"19bea4dd4c70bcf320186d897c8d71253df1aa2f0a38bf77d2d927e68c85cd60"}
Apr 21 02:43:04.997822 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.997798 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="d8b1c0e2a824024b64838116a317b89ce54455c17af0fe4157fbf3f530c84125" exitCode=0
Apr 21 02:43:04.997926 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:04.997852 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"d8b1c0e2a824024b64838116a317b89ce54455c17af0fe4157fbf3f530c84125"}
Apr 21 02:43:05.014164 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.014109 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9kctj" podStartSLOduration=3.231214723 podStartE2EDuration="4.014091187s" podCreationTimestamp="2026-04-21 02:43:01 +0000 UTC" firstStartedPulling="2026-04-21 02:43:02.261113979 +0000 UTC m=+115.194791046" lastFinishedPulling="2026-04-21 02:43:03.043990446 +0000 UTC m=+115.977667510" observedRunningTime="2026-04-21 02:43:05.013248393 +0000 UTC m=+117.946925464" watchObservedRunningTime="2026-04-21 02:43:05.014091187 +0000 UTC m=+117.947768266"
Apr 21 02:43:05.041551 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-grpc-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041759 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041759 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041759 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f9224e-a677-484e-bef8-d062d8fca5c5-metrics-client-ca\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041918 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041918 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041918 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.041918 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.041901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brn7d\" (UniqueName: \"kubernetes.io/projected/e4f9224e-a677-484e-bef8-d062d8fca5c5-kube-api-access-brn7d\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.142782 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.142750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f9224e-a677-484e-bef8-d062d8fca5c5-metrics-client-ca\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.142935 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.142889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.142973 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.142934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143025 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.142975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143025 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.143007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brn7d\" (UniqueName: \"kubernetes.io/projected/e4f9224e-a677-484e-bef8-d062d8fca5c5-kube-api-access-brn7d\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143689 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.143293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-grpc-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143689 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.143414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143689 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.143573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.143689 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.143588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f9224e-a677-484e-bef8-d062d8fca5c5-metrics-client-ca\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.145969 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.145945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.146087 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.146066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.146163 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.146140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.146227 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.146192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-grpc-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.146431 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.146412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-tls\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.146487 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.146446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f9224e-a677-484e-bef8-d062d8fca5c5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.150893 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.150875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brn7d\" (UniqueName: \"kubernetes.io/projected/e4f9224e-a677-484e-bef8-d062d8fca5c5-kube-api-access-brn7d\") pod \"thanos-querier-5c454b744c-66tqf\" (UID: \"e4f9224e-a677-484e-bef8-d062d8fca5c5\") " pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.165884 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.165854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf"
Apr 21 02:43:05.287359 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:05.287326 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c454b744c-66tqf"]
Apr 21 02:43:05.290405 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:43:05.290373 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f9224e_a677_484e_bef8_d062d8fca5c5.slice/crio-81f32d447e51914dcd5a2fc8db2af009fc7fdbf48b6367692ef154ddc62a83ce WatchSource:0}: Error finding container 81f32d447e51914dcd5a2fc8db2af009fc7fdbf48b6367692ef154ddc62a83ce: Status 404 returned error can't find the container with id 81f32d447e51914dcd5a2fc8db2af009fc7fdbf48b6367692ef154ddc62a83ce
Apr 21 02:43:06.002908 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:06.002856 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"81f32d447e51914dcd5a2fc8db2af009fc7fdbf48b6367692ef154ddc62a83ce"}
Apr 21 02:43:07.008871 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:07.008833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"a74a46b23672df3931e3be7d021d5905409b43f24ca4576035540ca2c1c39db8"}
Apr 21 02:43:07.008871 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:07.008875 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"f6db709bf80bb0e92479e700c6f04194167c2310c6211c7b08ce0bbd21356551"}
Apr 21 02:43:07.009336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:07.008888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"ea17287f2185aa63102283695abb88371705086a5e44549b1943b01462642a96"}
Apr 21 02:43:07.009336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:07.008899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"9375f6035e29614aa7790b974420cffa4f7989c479c524bec551b75b2aed998f"}
Apr 21 02:43:08.015060 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.015028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"4fe91e290c8cac0c942f04963e911a68282011e633a7a221e1e0d326c332ff11"}
Apr 21 02:43:08.015439 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.015067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerStarted","Data":"72895e3e9827c5d6b54be58f0e89feb618da5501c624f1bfb8e26ed6f0d5b652"}
Apr 21 02:43:08.017482 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.017456 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"d8cd5e327a91b293fc30d4a0bfb0d03878349650dbf4ea2e6427dfa48b69989d"}
Apr 21 02:43:08.017586 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.017487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"dac858bd05b749a120bb0a65dab9368c1e08b592d49fd2b4589c5e88afcbf5d2"}
Apr 21 02:43:08.017586 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.017497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"286f65fcf6df17cbcdbb55cf579b976b0fc2479e28e43300db809155557ae698"}
Apr 21 02:43:08.017586 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.017507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"8b77edc187a8832e9c1fa31091d167d9e8ef07211a46114b9fdf82656b0f20de"}
Apr 21 02:43:08.017586 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.017518 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"5da7e3764cc4e67d8e8b8176bc344244972067c1014427ad4b46c96c71dbb662"}
Apr 21 02:43:08.038331 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.038252 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.5557069289999998 podStartE2EDuration="6.038205777s" podCreationTimestamp="2026-04-21 02:43:02 +0000 UTC" firstStartedPulling="2026-04-21 02:43:03.38607182 +0000 UTC m=+116.319748889" lastFinishedPulling="2026-04-21 02:43:07.86857068 +0000 UTC m=+120.802247737" observedRunningTime="2026-04-21 02:43:08.036846787 +0000 UTC m=+120.970523897" watchObservedRunningTime="2026-04-21 02:43:08.038205777 +0000 UTC m=+120.971882854"
Apr 21 02:43:08.049650 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.049623 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 02:43:08.054471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.054447 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.056850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.056894 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fv883225ab67k\""
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.056850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.056977 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pv999\""
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.056983 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 21 02:43:08.057089 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057051 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 21 02:43:08.057800 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 21 02:43:08.057800 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057428 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 21 02:43:08.057800 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057491 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 21 02:43:08.057800 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 21 02:43:08.057800 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057717 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 21 02:43:08.058201 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.057929 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 21 02:43:08.058534 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.058513 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 21 02:43:08.059299 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.058807 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 21 02:43:08.060567 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.060547 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 21 02:43:08.066114 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.065742 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 02:43:08.068145 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068145 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068145 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068145 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067534 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068145 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068478 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.067702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068478 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068478 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068478 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkw8\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068659 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068659 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068659 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068798 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068798 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.068798 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.068773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.069051 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.069032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.069123 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.069067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.069123 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.069103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170347 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421
02:43:08.170489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170530 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170731 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkw8\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.170911 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.170837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.171453 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.171180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.171453 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.171365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.171860 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.171829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle\") 
pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.172565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.172526 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.173668 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.173634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.173668 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.173651 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.173830 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.173809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.174260 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.174102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.174260 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.174125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.174260 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.174193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.174260 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.174214 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.174485 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.174301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.175627 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.175603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.176262 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.176220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.176404 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.176386 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.176512 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.176493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.176581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.176565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.178182 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.178164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7mkw8\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8\") pod \"prometheus-k8s-0\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.368488 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.368460 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:08.489909 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:08.489784 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:43:08.492266 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:43:08.492225 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd652074_8275_40fb_aece_61b3ca615da7.slice/crio-55703bebc14919a54a9bcaad9b325b60c941463eeb687643ea893d07135ce44c WatchSource:0}: Error finding container 55703bebc14919a54a9bcaad9b325b60c941463eeb687643ea893d07135ce44c: Status 404 returned error can't find the container with id 55703bebc14919a54a9bcaad9b325b60c941463eeb687643ea893d07135ce44c Apr 21 02:43:09.022899 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.022857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" event={"ID":"e4f9224e-a677-484e-bef8-d062d8fca5c5","Type":"ContainerStarted","Data":"8e2ff504f434e94062b987a520a5445058a1a173b68b709f53a8b51e610b6398"} Apr 21 02:43:09.023405 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.023057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" Apr 21 02:43:09.024282 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.024257 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" 
containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" exitCode=0 Apr 21 02:43:09.024417 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.024332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} Apr 21 02:43:09.024417 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.024355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"55703bebc14919a54a9bcaad9b325b60c941463eeb687643ea893d07135ce44c"} Apr 21 02:43:09.042582 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:09.042543 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" podStartSLOduration=2.465821567 podStartE2EDuration="5.042530856s" podCreationTimestamp="2026-04-21 02:43:04 +0000 UTC" firstStartedPulling="2026-04-21 02:43:05.292330063 +0000 UTC m=+118.226007116" lastFinishedPulling="2026-04-21 02:43:07.869039351 +0000 UTC m=+120.802716405" observedRunningTime="2026-04-21 02:43:09.041173117 +0000 UTC m=+121.974850192" watchObservedRunningTime="2026-04-21 02:43:09.042530856 +0000 UTC m=+121.976207964" Apr 21 02:43:12.039466 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:12.039430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} Apr 21 02:43:12.039799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:12.039475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} Apr 21 02:43:12.039799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:12.039488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} Apr 21 02:43:12.039799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:12.039500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} Apr 21 02:43:12.039799 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:12.039511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} Apr 21 02:43:13.045423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:13.045390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerStarted","Data":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} Apr 21 02:43:13.070617 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:13.070558 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.439421483 podStartE2EDuration="5.070542922s" podCreationTimestamp="2026-04-21 02:43:08 +0000 UTC" firstStartedPulling="2026-04-21 02:43:09.025415741 +0000 UTC m=+121.959092794" lastFinishedPulling="2026-04-21 02:43:11.656537178 +0000 UTC m=+124.590214233" observedRunningTime="2026-04-21 
02:43:13.068746465 +0000 UTC m=+126.002423547" watchObservedRunningTime="2026-04-21 02:43:13.070542922 +0000 UTC m=+126.004220210" Apr 21 02:43:13.368971 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:13.368887 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:43:15.034813 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:15.034785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c454b744c-66tqf" Apr 21 02:43:16.967439 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:16.967410 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bb69dcc45-djkbs" Apr 21 02:43:17.362937 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:17.362854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:43:17.365144 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:17.365109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3ca7174-0a17-4896-b723-717a079d23e3-metrics-certs\") pod \"network-metrics-daemon-2f9pd\" (UID: \"f3ca7174-0a17-4896-b723-717a079d23e3\") " pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:43:17.640400 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:17.640372 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\"" Apr 21 02:43:17.648641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:17.648614 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f9pd" Apr 21 02:43:17.767859 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:17.767827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2f9pd"] Apr 21 02:43:17.771201 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:43:17.771168 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ca7174_0a17_4896_b723_717a079d23e3.slice/crio-9e48317e7b96993c2e80d2df60b618252c9e938e3e04ba02459c7e80e89e2655 WatchSource:0}: Error finding container 9e48317e7b96993c2e80d2df60b618252c9e938e3e04ba02459c7e80e89e2655: Status 404 returned error can't find the container with id 9e48317e7b96993c2e80d2df60b618252c9e938e3e04ba02459c7e80e89e2655 Apr 21 02:43:18.065117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:18.065025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f9pd" event={"ID":"f3ca7174-0a17-4896-b723-717a079d23e3","Type":"ContainerStarted","Data":"9e48317e7b96993c2e80d2df60b618252c9e938e3e04ba02459c7e80e89e2655"} Apr 21 02:43:19.069276 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:19.069178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f9pd" event={"ID":"f3ca7174-0a17-4896-b723-717a079d23e3","Type":"ContainerStarted","Data":"5bace8f20c62f27bf75156689f552bcba2f4aa84a6f5c84052816d2ca90ff51d"} Apr 21 02:43:19.069276 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:19.069212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f9pd" event={"ID":"f3ca7174-0a17-4896-b723-717a079d23e3","Type":"ContainerStarted","Data":"a2f81f14ac2919dca0ad46a372aba12a2f6281c11e8f44b16cd63baf9b40a442"} Apr 21 02:43:19.083357 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:43:19.083311 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-2f9pd" podStartSLOduration=131.207551198 podStartE2EDuration="2m12.083297201s" podCreationTimestamp="2026-04-21 02:41:07 +0000 UTC" firstStartedPulling="2026-04-21 02:43:17.773019911 +0000 UTC m=+130.706696984" lastFinishedPulling="2026-04-21 02:43:18.648765933 +0000 UTC m=+131.582442987" observedRunningTime="2026-04-21 02:43:19.082442502 +0000 UTC m=+132.016119578" watchObservedRunningTime="2026-04-21 02:43:19.083297201 +0000 UTC m=+132.016974277" Apr 21 02:44:01.195318 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:01.195286 2572 generic.go:358] "Generic (PLEG): container finished" podID="b7aaabb9-fee1-4cd6-9c82-badd547250ae" containerID="dc90b891be2dd5d3644a93cb3142bebab437bcaf87f720dfaa8ae510ec9b095b" exitCode=0 Apr 21 02:44:01.195725 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:01.195359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" event={"ID":"b7aaabb9-fee1-4cd6-9c82-badd547250ae","Type":"ContainerDied","Data":"dc90b891be2dd5d3644a93cb3142bebab437bcaf87f720dfaa8ae510ec9b095b"} Apr 21 02:44:01.195725 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:01.195695 2572 scope.go:117] "RemoveContainer" containerID="dc90b891be2dd5d3644a93cb3142bebab437bcaf87f720dfaa8ae510ec9b095b" Apr 21 02:44:02.200162 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:02.200121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z9799" event={"ID":"b7aaabb9-fee1-4cd6-9c82-badd547250ae","Type":"ContainerStarted","Data":"438f590a4436a1e6466d0433a26533af8c851fed2e831e318187b08a7586cb2e"} Apr 21 02:44:08.369177 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:08.369139 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:08.384457 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:08.384435 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:09.235084 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:09.235056 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:22.119809 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.119772 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:22.120447 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120373 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="alertmanager" containerID="cri-o://9375f6035e29614aa7790b974420cffa4f7989c479c524bec551b75b2aed998f" gracePeriod=120 Apr 21 02:44:22.120580 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120451 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-metric" containerID="cri-o://72895e3e9827c5d6b54be58f0e89feb618da5501c624f1bfb8e26ed6f0d5b652" gracePeriod=120 Apr 21 02:44:22.120580 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120478 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-web" containerID="cri-o://f6db709bf80bb0e92479e700c6f04194167c2310c6211c7b08ce0bbd21356551" gracePeriod=120 Apr 21 02:44:22.120580 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120526 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy" 
containerID="cri-o://a74a46b23672df3931e3be7d021d5905409b43f24ca4576035540ca2c1c39db8" gracePeriod=120 Apr 21 02:44:22.120580 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120486 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="prom-label-proxy" containerID="cri-o://4fe91e290c8cac0c942f04963e911a68282011e633a7a221e1e0d326c332ff11" gracePeriod=120 Apr 21 02:44:22.120580 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.120565 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="config-reloader" containerID="cri-o://ea17287f2185aa63102283695abb88371705086a5e44549b1943b01462642a96" gracePeriod=120 Apr 21 02:44:22.259339 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259303 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="4fe91e290c8cac0c942f04963e911a68282011e633a7a221e1e0d326c332ff11" exitCode=0 Apr 21 02:44:22.259339 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259330 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="72895e3e9827c5d6b54be58f0e89feb618da5501c624f1bfb8e26ed6f0d5b652" exitCode=0 Apr 21 02:44:22.259339 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259339 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="a74a46b23672df3931e3be7d021d5905409b43f24ca4576035540ca2c1c39db8" exitCode=0 Apr 21 02:44:22.259339 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259346 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="ea17287f2185aa63102283695abb88371705086a5e44549b1943b01462642a96" exitCode=0 Apr 21 02:44:22.259589 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259354 2572 generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="9375f6035e29614aa7790b974420cffa4f7989c479c524bec551b75b2aed998f" exitCode=0 Apr 21 02:44:22.259589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"4fe91e290c8cac0c942f04963e911a68282011e633a7a221e1e0d326c332ff11"} Apr 21 02:44:22.259589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"72895e3e9827c5d6b54be58f0e89feb618da5501c624f1bfb8e26ed6f0d5b652"} Apr 21 02:44:22.259589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"a74a46b23672df3931e3be7d021d5905409b43f24ca4576035540ca2c1c39db8"} Apr 21 02:44:22.259589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259424 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"ea17287f2185aa63102283695abb88371705086a5e44549b1943b01462642a96"} Apr 21 02:44:22.259589 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:22.259432 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"9375f6035e29614aa7790b974420cffa4f7989c479c524bec551b75b2aed998f"} Apr 21 02:44:23.266054 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.266024 2572 
generic.go:358] "Generic (PLEG): container finished" podID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerID="f6db709bf80bb0e92479e700c6f04194167c2310c6211c7b08ce0bbd21356551" exitCode=0 Apr 21 02:44:23.266420 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.266077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"f6db709bf80bb0e92479e700c6f04194167c2310c6211c7b08ce0bbd21356551"} Apr 21 02:44:23.363463 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.363439 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:23.538423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538346 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538386 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538410 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppkmc\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538432 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538594 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538642 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538671 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:44:23.538686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:44:23.538703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538704 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.539047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538761 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.539047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.539047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538837 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.539047 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.538881 
2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets\") pod \"704a669d-91d3-46df-bdd7-8bb9ac616307\" (UID: \"704a669d-91d3-46df-bdd7-8bb9ac616307\") " Apr 21 02:44:23.539274 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.539203 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-main-db\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.539372 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.539342 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:23.540860 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.540537 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:23.541454 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.541410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc" (OuterVolumeSpecName: "kube-api-access-ppkmc") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). 
InnerVolumeSpecName "kube-api-access-ppkmc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:44:23.541899 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.541853 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.542077 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.542051 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:44:23.542686 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.542661 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.542931 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.542907 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.543160 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.543140 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume" (OuterVolumeSpecName: "config-volume") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.543342 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.543320 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out" (OuterVolumeSpecName: "config-out") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:44:23.543397 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.543358 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.546469 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.546379 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.552827 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.552807 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config" (OuterVolumeSpecName: "web-config") pod "704a669d-91d3-46df-bdd7-8bb9ac616307" (UID: "704a669d-91d3-46df-bdd7-8bb9ac616307"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:23.640008 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.639981 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640008 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640005 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640016 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640025 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640035 2572 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-secret-alertmanager-main-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640048 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-config-volume\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640057 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/704a669d-91d3-46df-bdd7-8bb9ac616307-config-out\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640066 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/704a669d-91d3-46df-bdd7-8bb9ac616307-metrics-client-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640074 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-tls-assets\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640083 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-web-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640090 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ppkmc\" (UniqueName: 
\"kubernetes.io/projected/704a669d-91d3-46df-bdd7-8bb9ac616307-kube-api-access-ppkmc\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:23.640151 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:23.640100 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/704a669d-91d3-46df-bdd7-8bb9ac616307-cluster-tls-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:24.271817 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.271779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"704a669d-91d3-46df-bdd7-8bb9ac616307","Type":"ContainerDied","Data":"61911936d342787d1e63a5b8436656b735ec1a21b63c274359ae0d84918485cb"} Apr 21 02:44:24.271817 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.271822 2572 scope.go:117] "RemoveContainer" containerID="4fe91e290c8cac0c942f04963e911a68282011e633a7a221e1e0d326c332ff11" Apr 21 02:44:24.272285 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.271864 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.278830 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.278813 2572 scope.go:117] "RemoveContainer" containerID="72895e3e9827c5d6b54be58f0e89feb618da5501c624f1bfb8e26ed6f0d5b652" Apr 21 02:44:24.285503 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.285478 2572 scope.go:117] "RemoveContainer" containerID="a74a46b23672df3931e3be7d021d5905409b43f24ca4576035540ca2c1c39db8" Apr 21 02:44:24.292134 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.292104 2572 scope.go:117] "RemoveContainer" containerID="f6db709bf80bb0e92479e700c6f04194167c2310c6211c7b08ce0bbd21356551" Apr 21 02:44:24.292544 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.292521 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:24.296417 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.296396 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:24.299277 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.299261 2572 scope.go:117] "RemoveContainer" containerID="ea17287f2185aa63102283695abb88371705086a5e44549b1943b01462642a96" Apr 21 02:44:24.305365 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.305351 2572 scope.go:117] "RemoveContainer" containerID="9375f6035e29614aa7790b974420cffa4f7989c479c524bec551b75b2aed998f" Apr 21 02:44:24.311460 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.311444 2572 scope.go:117] "RemoveContainer" containerID="d8b1c0e2a824024b64838116a317b89ce54455c17af0fe4157fbf3f530c84125" Apr 21 02:44:24.323914 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.323893 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:24.324198 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324186 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy" Apr 21 02:44:24.324271 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324211 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy" Apr 21 02:44:24.324271 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324224 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="alertmanager" Apr 21 02:44:24.324340 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324230 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="alertmanager" Apr 21 02:44:24.324340 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324329 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="config-reloader" Apr 21 02:44:24.324340 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324338 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="config-reloader" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324348 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-web" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324354 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-web" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324364 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="prom-label-proxy" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324373 2572 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="prom-label-proxy" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324385 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="init-config-reloader" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324391 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="init-config-reloader" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324399 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-metric" Apr 21 02:44:24.324423 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324404 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-metric" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324460 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-metric" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324468 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="prom-label-proxy" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324476 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy-web" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324482 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="alertmanager" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324489 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="config-reloader" Apr 21 02:44:24.324640 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.324496 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" containerName="kube-rbac-proxy" Apr 21 02:44:24.329495 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.329479 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.331606 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331588 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 02:44:24.331705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 02:44:24.331705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331629 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 02:44:24.331705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 02:44:24.331990 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331975 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 02:44:24.332062 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.331998 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 02:44:24.332116 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.332098 2572 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qxlzm\"" Apr 21 02:44:24.332176 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.332138 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 02:44:24.332527 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.332510 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 02:44:24.336948 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.336923 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 02:44:24.340098 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.340078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:24.447101 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447101 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nj2\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-kube-api-access-n9nj2\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447101 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-config-volume\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447600 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447423 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-config-out\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447600 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447600 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447600 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-web-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.447600 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.447549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.547981 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.547938 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548172 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.547990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548172 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-web-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nj2\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-kube-api-access-n9nj2\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:44:24.548339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-config-volume\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.548571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.548495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-config-out\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.549098 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.549074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551219 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed554f9-0494-4890-85cd-17e7b666d556-config-out\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551368 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551368 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551488 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551488 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551488 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551825 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-web-config\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.551915 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.551897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eed554f9-0494-4890-85cd-17e7b666d556-alertmanager-trusted-ca-bundle\") 
pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.552098 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.552077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.553177 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.553161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eed554f9-0494-4890-85cd-17e7b666d556-config-volume\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.556117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.556095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nj2\" (UniqueName: \"kubernetes.io/projected/eed554f9-0494-4890-85cd-17e7b666d556-kube-api-access-n9nj2\") pod \"alertmanager-main-0\" (UID: \"eed554f9-0494-4890-85cd-17e7b666d556\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.640218 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.640165 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 02:44:24.768844 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:24.767338 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 02:44:24.770392 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:44:24.770362 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed554f9_0494_4890_85cd_17e7b666d556.slice/crio-a4789d682c53eaf72add92fede3d46f97f11f898fb484d3be86dd9d3fa01ca2d WatchSource:0}: Error finding container a4789d682c53eaf72add92fede3d46f97f11f898fb484d3be86dd9d3fa01ca2d: Status 404 returned error can't find the container with id a4789d682c53eaf72add92fede3d46f97f11f898fb484d3be86dd9d3fa01ca2d Apr 21 02:44:25.277208 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:25.277176 2572 generic.go:358] "Generic (PLEG): container finished" podID="eed554f9-0494-4890-85cd-17e7b666d556" containerID="9f3c5fa0d207ba8fa6bdaf64458fe23e85f81ffbf017307152510d23a373a62c" exitCode=0 Apr 21 02:44:25.277208 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:25.277211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerDied","Data":"9f3c5fa0d207ba8fa6bdaf64458fe23e85f81ffbf017307152510d23a373a62c"} Apr 21 02:44:25.277623 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:25.277251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"a4789d682c53eaf72add92fede3d46f97f11f898fb484d3be86dd9d3fa01ca2d"} Apr 21 02:44:25.622227 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:25.622199 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704a669d-91d3-46df-bdd7-8bb9ac616307" 
path="/var/lib/kubelet/pods/704a669d-91d3-46df-bdd7-8bb9ac616307/volumes" Apr 21 02:44:26.282687 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282647 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"9ff82f005238bdf1f2409470238c44eb6435305775783da9e02a0736b24790d0"} Apr 21 02:44:26.282687 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"925d3b319cba6e40aaf689edcdcc2929e6d32467935bdf6c0ba1df2f974c7789"} Apr 21 02:44:26.283107 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"a9a923d222dbbb20cd5c4b98b8de206cfe40d55c174add8e4cfb4fd62f240919"} Apr 21 02:44:26.283107 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"4d8adca3799fb47f94dc4aff6a967f3aff9070264f09fe9e2a4edcb3a3da3000"} Apr 21 02:44:26.283107 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"784a7416edbbfacf640ae67d172e4d332e453895571e4676f95676448c0de784"} Apr 21 02:44:26.283107 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.282731 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"eed554f9-0494-4890-85cd-17e7b666d556","Type":"ContainerStarted","Data":"9ac9ef811bbef8b6403b6bab360fa25d396ce067bb4fd393be1cee2e9dd0fb17"} Apr 21 02:44:26.308957 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.308913 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.308898732 podStartE2EDuration="2.308898732s" podCreationTimestamp="2026-04-21 02:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:44:26.307613559 +0000 UTC m=+199.241290665" watchObservedRunningTime="2026-04-21 02:44:26.308898732 +0000 UTC m=+199.242575808" Apr 21 02:44:26.421737 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.421703 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422145 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="prometheus" containerID="cri-o://08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" gracePeriod=600 Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422161 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy" containerID="cri-o://25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" gracePeriod=600 Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422169 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="thanos-sidecar" 
containerID="cri-o://c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" gracePeriod=600 Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422217 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-thanos" containerID="cri-o://217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" gracePeriod=600 Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422191 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="config-reloader" containerID="cri-o://0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" gracePeriod=600 Apr 21 02:44:26.422399 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.422286 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-web" containerID="cri-o://09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" gracePeriod=600 Apr 21 02:44:26.660288 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.660265 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:26.768428 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768394 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768428 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768427 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768472 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768504 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: 
\"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768559 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768596 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768651 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768675 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:44:26.768707 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768732 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768782 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768887 
2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768919 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.768999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.768948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mkw8\" (UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8\") pod \"dd652074-8275-40fb-aece-61b3ca615da7\" (UID: \"dd652074-8275-40fb-aece-61b3ca615da7\") " Apr 21 02:44:26.770171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.769809 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:26.770171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.769833 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:26.770171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.770150 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:26.771045 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.771009 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:44:26.771667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.771339 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). 
InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:26.771667 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.771630 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:44:26.772271 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.772222 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out" (OuterVolumeSpecName: "config-out") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:44:26.772352 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.772270 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.772504 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.772484 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.772603 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.772507 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.772753 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.772716 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.773111 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773077 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8" (OuterVolumeSpecName: "kube-api-access-7mkw8") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "kube-api-access-7mkw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:44:26.773186 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773121 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:44:26.773186 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773152 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.773505 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773488 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config" (OuterVolumeSpecName: "config") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.773769 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773752 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.773932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.773918 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.782534 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.782515 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config" (OuterVolumeSpecName: "web-config") pod "dd652074-8275-40fb-aece-61b3ca615da7" (UID: "dd652074-8275-40fb-aece-61b3ca615da7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870367 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870395 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870405 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870414 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-metrics-client-certs\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870424 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870432 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870434 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870444 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-prometheus-k8s-db\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870453 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-web-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870462 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870471 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870481 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mkw8\" 
(UniqueName: \"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-kube-api-access-7mkw8\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870490 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-grpc-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870499 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870509 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-secret-kube-rbac-proxy\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870518 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd652074-8275-40fb-aece-61b3ca615da7-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870548 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd652074-8275-40fb-aece-61b3ca615da7-config-out\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870559 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/dd652074-8275-40fb-aece-61b3ca615da7-tls-assets\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:26.870663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:26.870568 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd652074-8275-40fb-aece-61b3ca615da7-configmap-metrics-client-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294523 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" exitCode=0 Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294547 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" exitCode=0 Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294553 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" exitCode=0 Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294559 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" exitCode=0 Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294564 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" exitCode=0 Apr 21 02:44:27.294560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294569 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="dd652074-8275-40fb-aece-61b3ca615da7" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" exitCode=0 Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294647 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294675 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"dd652074-8275-40fb-aece-61b3ca615da7","Type":"ContainerDied","Data":"55703bebc14919a54a9bcaad9b325b60c941463eeb687643ea893d07135ce44c"} Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294672 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.295199 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.294689 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.304193 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.304074 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.311098 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.311081 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.317264 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.317229 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.319561 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.319541 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:44:27.324021 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.324000 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:44:27.325268 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.325247 2572 scope.go:117] "RemoveContainer" 
containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.331111 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.331093 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.337404 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.337386 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.343367 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.343352 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.343594 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.343572 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.343655 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.343607 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.343655 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.343651 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.343860 ip-10-0-134-66 
kubenswrapper[2572]: E0421 02:44:27.343843 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.343900 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.343866 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.343900 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.343882 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.344084 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.344068 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.344120 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344086 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to 
get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.344120 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344097 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.344372 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.344353 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.344428 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344377 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.344428 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344392 2572 scope.go:117] "RemoveContainer" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.344604 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.344589 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.344641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344606 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.344641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344618 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.344769 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.344753 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.344806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344773 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = 
could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.344806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.344786 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.345011 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:44:27.344992 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.345073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345019 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.345073 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345039 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.345310 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345231 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status 
\"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.345353 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345311 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.345512 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345492 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.345571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345514 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.345707 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345692 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.345754 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:44:27.345707 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.345869 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345854 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.345906 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.345868 2572 scope.go:117] "RemoveContainer" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.346037 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346021 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.346074 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346037 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.346209 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346189 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.346271 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346210 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.346404 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346389 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.346457 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346404 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.346615 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346596 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 
217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.346615 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346614 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.346840 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346824 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.346877 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346841 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.347012 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.346994 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.347052 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347012 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.347171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347154 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.347213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347171 2572 scope.go:117] "RemoveContainer" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.347384 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347365 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.347458 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347386 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.347588 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347571 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = could not find container 
\"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.347641 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347589 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.347807 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347787 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.347874 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347811 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.347979 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347962 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.348027 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.347979 2572 scope.go:117] "RemoveContainer" 
containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.348153 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348137 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.348213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348155 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.348348 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348331 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.348394 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348350 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.348548 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348530 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status 
\"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.348591 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348549 2572 scope.go:117] "RemoveContainer" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.348724 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348709 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.348760 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348725 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.348885 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.348871 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.348933 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:44:27.348886 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.349117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349094 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.349169 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349118 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.349346 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349327 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.349393 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349347 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.349559 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349544 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.349596 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349559 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.349734 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349717 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.349774 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349734 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.349904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349890 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with 
c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.349947 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.349904 2572 scope.go:117] "RemoveContainer" containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.350105 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350082 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.350142 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350106 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.350336 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350317 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.350385 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350337 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.350561 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350543 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.350605 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350562 2572 scope.go:117] "RemoveContainer" containerID="217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae" Apr 21 02:44:27.350775 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350756 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae"} err="failed to get container status \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": rpc error: code = NotFound desc = could not find container \"217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae\": container with ID starting with 217a6975586bcde2b3a579e761b3a5aeac4d7a5a5f36a596a049f7b3f5822cae not found: ID does not exist" Apr 21 02:44:27.350775 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350774 2572 scope.go:117] "RemoveContainer" containerID="25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf" Apr 21 02:44:27.351025 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350979 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf"} err="failed to get container status \"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": rpc error: code = NotFound desc = could not find container 
\"25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf\": container with ID starting with 25018d29d7150173683a334c71e4a53a312f4bf69700400102429e7b28543dbf not found: ID does not exist" Apr 21 02:44:27.351025 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.350999 2572 scope.go:117] "RemoveContainer" containerID="09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c" Apr 21 02:44:27.351779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351357 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c"} err="failed to get container status \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": rpc error: code = NotFound desc = could not find container \"09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c\": container with ID starting with 09409d4cfa5bc8a4ff9b501d16d419f40824bd9d483d13e06971870033011b9c not found: ID does not exist" Apr 21 02:44:27.351779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351383 2572 scope.go:117] "RemoveContainer" containerID="c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f" Apr 21 02:44:27.351779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351660 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f"} err="failed to get container status \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": rpc error: code = NotFound desc = could not find container \"c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f\": container with ID starting with c721bf6bf7d05f15c3b62982631250c023bd90d4d8c64baebd1744abe2a34f9f not found: ID does not exist" Apr 21 02:44:27.351779 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351681 2572 scope.go:117] "RemoveContainer" 
containerID="0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872" Apr 21 02:44:27.352001 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351974 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872"} err="failed to get container status \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": rpc error: code = NotFound desc = could not find container \"0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872\": container with ID starting with 0a4b3c5deda7b43a01bc6c646f279a44c622b0189154a6e12ac8db0c7e032872 not found: ID does not exist" Apr 21 02:44:27.352001 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.351998 2572 scope.go:117] "RemoveContainer" containerID="08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437" Apr 21 02:44:27.352312 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.352288 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437"} err="failed to get container status \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": rpc error: code = NotFound desc = could not find container \"08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437\": container with ID starting with 08ffc9cc96ec2e86d85d2d0c63ff1e27ecc1c68860d169e67a4748059af92437 not found: ID does not exist" Apr 21 02:44:27.352437 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.352312 2572 scope.go:117] "RemoveContainer" containerID="c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39" Apr 21 02:44:27.352672 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.352641 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39"} err="failed to get container status 
\"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": rpc error: code = NotFound desc = could not find container \"c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39\": container with ID starting with c20bc0e02cbdc1d6e78672165367e980f2f8a75b35a1cea4050c96f336522f39 not found: ID does not exist" Apr 21 02:44:27.353477 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353458 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:44:27.353764 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353752 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="prometheus" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353765 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="prometheus" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353780 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="config-reloader" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353786 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="config-reloader" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353795 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="init-config-reloader" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353800 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="init-config-reloader" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353807 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="thanos-sidecar" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353812 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="thanos-sidecar" Apr 21 02:44:27.353819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353817 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-thanos" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353823 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-thanos" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353830 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-web" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353835 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-web" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353848 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353852 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353898 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-web" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353905 2572 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="thanos-sidecar" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353911 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="config-reloader" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353920 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy-thanos" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353926 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="prometheus" Apr 21 02:44:27.354033 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.353931 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd652074-8275-40fb-aece-61b3ca615da7" containerName="kube-rbac-proxy" Apr 21 02:44:27.359069 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.359055 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.361389 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361367 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 02:44:27.361482 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361409 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 02:44:27.361482 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 02:44:27.361724 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 02:44:27.361814 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361750 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 02:44:27.361814 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361771 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 02:44:27.361923 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361811 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fv883225ab67k\"" Apr 21 02:44:27.361923 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 02:44:27.361923 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.361867 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 02:44:27.362257 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.362222 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-pv999\"" Apr 21 02:44:27.362324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.362281 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 02:44:27.362324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.362288 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 02:44:27.362984 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.362969 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 02:44:27.364747 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.364730 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 02:44:27.368534 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.368515 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 02:44:27.380155 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.380137 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 02:44:27.474556 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hws2j\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-kube-api-access-hws2j\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
02:44:27.474556 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config-out\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474684 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-web-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.474850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: 
I0421 02:44:27.474898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.474998 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.475083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.475125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.475153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.475213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.475173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576342 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-web-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576342 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hws2j\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-kube-api-access-hws2j\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config-out\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.576910 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.576862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.577370 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.577348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.577436 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.577408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.577759 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.577734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.577988 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.577959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.579966 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.579156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.579966 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.579166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.579966 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.579471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-web-config\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.580171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.580171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.580777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.580777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.580777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.581020 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.580896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.581020 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.581000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.581765 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.581737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.582442 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.582424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b56b5aa0-1d7a-4b5c-942a-f89eed682509-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.582673 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.582659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b56b5aa0-1d7a-4b5c-942a-f89eed682509-config-out\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.582715 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.582674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b56b5aa0-1d7a-4b5c-942a-f89eed682509-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.584519 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.584498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hws2j\" (UniqueName: \"kubernetes.io/projected/b56b5aa0-1d7a-4b5c-942a-f89eed682509-kube-api-access-hws2j\") pod \"prometheus-k8s-0\" (UID: \"b56b5aa0-1d7a-4b5c-942a-f89eed682509\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.622618 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.622582 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd652074-8275-40fb-aece-61b3ca615da7" path="/var/lib/kubelet/pods/dd652074-8275-40fb-aece-61b3ca615da7/volumes"
Apr 21 02:44:27.669507 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.669480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:44:27.791501 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:27.791477 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 02:44:27.793397 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:44:27.793370 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56b5aa0_1d7a_4b5c_942a_f89eed682509.slice/crio-8010e28dbe8438e28c66c57a1d68d22506d8599a9e2c26595b574ce7902f30a8 WatchSource:0}: Error finding container 8010e28dbe8438e28c66c57a1d68d22506d8599a9e2c26595b574ce7902f30a8: Status 404 returned error can't find the container with id 8010e28dbe8438e28c66c57a1d68d22506d8599a9e2c26595b574ce7902f30a8
Apr 21 02:44:28.300113 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:28.300072 2572 generic.go:358] "Generic (PLEG): container finished" podID="b56b5aa0-1d7a-4b5c-942a-f89eed682509" containerID="44b96797e218088674616f3910695187072469b38b7eb3fdf8afe2b3a638d01e" exitCode=0
Apr 21 02:44:28.300565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:28.300159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerDied","Data":"44b96797e218088674616f3910695187072469b38b7eb3fdf8afe2b3a638d01e"}
Apr 21 02:44:28.300565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:28.300195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"8010e28dbe8438e28c66c57a1d68d22506d8599a9e2c26595b574ce7902f30a8"}
Apr 21 02:44:29.309484 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"2f2c131dc47e8f984ba6f44afbe1f65dab94a6071159fe34daacb2a129a3927f"}
Apr 21 02:44:29.309484 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"ee8c01524fc3de9472a21f20e60dcf31ab4a564f439eb6e7482d718dc9c8157c"}
Apr 21 02:44:29.309954 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"0f4ef82278c320a370d140ccf0b0c4eae97bdb7d35347ab85ca2d57855030b12"}
Apr 21 02:44:29.309954 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"af4e7ea342b8f64c4ab550b0551d8d855ff836c65552ae05ffaef5af5e6e7ad7"}
Apr 21 02:44:29.309954 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"e291c34e567053d46e4aec8ca9b3e529160c1e680adda50c0bb359e6c838d234"}
Apr 21 02:44:29.309954 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.309530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b56b5aa0-1d7a-4b5c-942a-f89eed682509","Type":"ContainerStarted","Data":"a4143a94d9a9cb1180f15c4cb966cee7b33e418484d8d3217573284d47d05e1e"}
Apr 21 02:44:29.335454 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:29.335411 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.335395633 podStartE2EDuration="2.335395633s" podCreationTimestamp="2026-04-21 02:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:44:29.334354117 +0000 UTC m=+202.268031192" watchObservedRunningTime="2026-04-21 02:44:29.335395633 +0000 UTC m=+202.269072705"
Apr 21 02:44:32.670490 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:44:32.670437 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:45:27.670685 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:45:27.670607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:45:27.685907 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:45:27.685885 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:45:28.497546 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:45:28.497515 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 02:46:07.461291 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:46:07.461261 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log"
Apr 21 02:46:07.461834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:46:07.461269 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log"
Apr 21 02:46:07.470314 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:46:07.470292 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 02:47:18.990219 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.990183 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"]
Apr 21 02:47:18.993420 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.993400 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:18.995615 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.995590 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 02:47:18.996806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.996787 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 02:47:18.997272 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.997229 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 21 02:47:18.997378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.997299 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5js6v\""
Apr 21 02:47:18.997378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.997314 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 02:47:18.997378 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:18.997304 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 21 02:47:19.004474 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.004449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"]
Apr 21 02:47:19.112251 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.112210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.112436 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.112264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/67ff259c-a10b-4e74-960e-826ad8e8f7d8-manager-config\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.112436 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.112309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppg9c\" (UniqueName: \"kubernetes.io/projected/67ff259c-a10b-4e74-960e-826ad8e8f7d8-kube-api-access-ppg9c\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.112436 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.112397 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.213631 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.213598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.213631 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.213635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/67ff259c-a10b-4e74-960e-826ad8e8f7d8-manager-config\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.213938 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.213664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppg9c\" (UniqueName: \"kubernetes.io/projected/67ff259c-a10b-4e74-960e-826ad8e8f7d8-kube-api-access-ppg9c\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.213938 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.213693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.214309 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.214287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/67ff259c-a10b-4e74-960e-826ad8e8f7d8-manager-config\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.216422 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.216398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-metrics-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.216518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.216398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ff259c-a10b-4e74-960e-826ad8e8f7d8-cert\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.229262 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.229222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppg9c\" (UniqueName: \"kubernetes.io/projected/67ff259c-a10b-4e74-960e-826ad8e8f7d8-kube-api-access-ppg9c\") pod \"lws-controller-manager-64dc57f969-bmrzh\" (UID: \"67ff259c-a10b-4e74-960e-826ad8e8f7d8\") " pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.303415 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.303338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"
Apr 21 02:47:19.422643 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.422589 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh"]
Apr 21 02:47:19.424985 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:47:19.424957 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ff259c_a10b_4e74_960e_826ad8e8f7d8.slice/crio-e65c893ff086d1d0f38d132a44a9b1a0f2febf6c3610d52e8555ded43629370e WatchSource:0}: Error finding container e65c893ff086d1d0f38d132a44a9b1a0f2febf6c3610d52e8555ded43629370e: Status 404 returned error can't find the container with id e65c893ff086d1d0f38d132a44a9b1a0f2febf6c3610d52e8555ded43629370e
Apr 21 02:47:19.426857 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.426838 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 02:47:19.800046 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.800017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh" event={"ID":"67ff259c-a10b-4e74-960e-826ad8e8f7d8","Type":"ContainerStarted","Data":"e65c893ff086d1d0f38d132a44a9b1a0f2febf6c3610d52e8555ded43629370e"}
Apr 21 02:47:19.940572 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.940544 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"]
Apr 21 02:47:19.945138 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.945116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:19.947192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.947169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 02:47:19.947619 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.947600 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 02:47:19.947711 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.947660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 02:47:19.947873 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.947855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pmqhb\""
Apr 21 02:47:19.948383 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.948365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 02:47:19.968065 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:19.968043 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"]
Apr 21 02:47:20.024222 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.024191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.024587 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.024229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.024587 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.024346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79pt5\" (UniqueName: \"kubernetes.io/projected/274dfc18-7aac-4617-8591-baab33d20f16-kube-api-access-79pt5\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.125777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.125696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.125777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.125744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.125990 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.125819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79pt5\" (UniqueName: \"kubernetes.io/projected/274dfc18-7aac-4617-8591-baab33d20f16-kube-api-access-79pt5\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.128405 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.128383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.128608 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.128586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/274dfc18-7aac-4617-8591-baab33d20f16-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.134519 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.134495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79pt5\" (UniqueName: \"kubernetes.io/projected/274dfc18-7aac-4617-8591-baab33d20f16-kube-api-access-79pt5\") pod \"opendatahub-operator-controller-manager-5f4d6bff-zcjlh\" (UID: \"274dfc18-7aac-4617-8591-baab33d20f16\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.256255 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.256193 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:20.422725 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.422667 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"]
Apr 21 02:47:20.428015 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:47:20.427979 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274dfc18_7aac_4617_8591_baab33d20f16.slice/crio-fef1298321cc356abc6142d12bdc69a5b541f15f1b4549b3b90dbcefc1797df5 WatchSource:0}: Error finding container fef1298321cc356abc6142d12bdc69a5b541f15f1b4549b3b90dbcefc1797df5: Status 404 returned error can't find the container with id fef1298321cc356abc6142d12bdc69a5b541f15f1b4549b3b90dbcefc1797df5
Apr 21 02:47:20.805250 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:20.805146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh" event={"ID":"274dfc18-7aac-4617-8591-baab33d20f16","Type":"ContainerStarted","Data":"fef1298321cc356abc6142d12bdc69a5b541f15f1b4549b3b90dbcefc1797df5"}
Apr 21 02:47:23.817059 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.817021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh" event={"ID":"274dfc18-7aac-4617-8591-baab33d20f16","Type":"ContainerStarted","Data":"81e04debe96c7443879d5498c233aaaa8f489d9715474727cd7b318b9103e5fc"}
Apr 21 02:47:23.817519 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.817184 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh"
Apr 21 02:47:23.818333 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.818311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh" event={"ID":"67ff259c-a10b-4e74-960e-826ad8e8f7d8","Type":"ContainerStarted","Data":"b57c4446d0222eab8d9561e80f1e5dd903b0d352225fda50f9fb5ef8ea751351"} Apr 21 02:47:23.818429 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.818422 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh" Apr 21 02:47:23.836296 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.836230 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh" podStartSLOduration=2.019436037 podStartE2EDuration="4.836214844s" podCreationTimestamp="2026-04-21 02:47:19 +0000 UTC" firstStartedPulling="2026-04-21 02:47:20.430257452 +0000 UTC m=+373.363934507" lastFinishedPulling="2026-04-21 02:47:23.247036243 +0000 UTC m=+376.180713314" observedRunningTime="2026-04-21 02:47:23.834150267 +0000 UTC m=+376.767827343" watchObservedRunningTime="2026-04-21 02:47:23.836214844 +0000 UTC m=+376.769891921" Apr 21 02:47:23.853419 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:23.853370 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh" podStartSLOduration=2.072327793 podStartE2EDuration="5.853357154s" podCreationTimestamp="2026-04-21 02:47:18 +0000 UTC" firstStartedPulling="2026-04-21 02:47:19.426963719 +0000 UTC m=+372.360640773" lastFinishedPulling="2026-04-21 02:47:23.207993076 +0000 UTC m=+376.141670134" observedRunningTime="2026-04-21 02:47:23.851899594 +0000 UTC m=+376.785576681" watchObservedRunningTime="2026-04-21 02:47:23.853357154 +0000 UTC m=+376.787034237" Apr 21 02:47:34.824428 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:34.824395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-lws-operator/lws-controller-manager-64dc57f969-bmrzh" Apr 21 02:47:34.824827 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:34.824444 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-zcjlh" Apr 21 02:47:50.475070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.475023 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5"] Apr 21 02:47:50.481361 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.481338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.483826 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.483799 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-2526x\"" Apr 21 02:47:50.484063 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.484044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 02:47:50.484506 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.484487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 02:47:50.485448 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.485425 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5"] Apr 21 02:47:50.575929 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.575895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntk7r\" (UniqueName: \"kubernetes.io/projected/a90ecd6e-461f-481a-9742-b405e68ba59e-kube-api-access-ntk7r\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 
02:47:50.576108 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.575946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a90ecd6e-461f-481a-9742-b405e68ba59e-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.576108 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.575977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90ecd6e-461f-481a-9742-b405e68ba59e-tmp\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.677332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.677304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a90ecd6e-461f-481a-9742-b405e68ba59e-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.677476 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.677351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90ecd6e-461f-481a-9742-b405e68ba59e-tmp\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.677476 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.677399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntk7r\" (UniqueName: \"kubernetes.io/projected/a90ecd6e-461f-481a-9742-b405e68ba59e-kube-api-access-ntk7r\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: 
\"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.679625 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.679602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90ecd6e-461f-481a-9742-b405e68ba59e-tmp\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.679842 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.679820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a90ecd6e-461f-481a-9742-b405e68ba59e-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.686401 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.686377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntk7r\" (UniqueName: \"kubernetes.io/projected/a90ecd6e-461f-481a-9742-b405e68ba59e-kube-api-access-ntk7r\") pod \"kube-auth-proxy-55fc66fcf7-rd6z5\" (UID: \"a90ecd6e-461f-481a-9742-b405e68ba59e\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.792322 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.792249 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" Apr 21 02:47:50.908370 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:50.908341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5"] Apr 21 02:47:50.910678 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:47:50.910649 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90ecd6e_461f_481a_9742_b405e68ba59e.slice/crio-42438083f418910aeb9e471a4a248dfd82f5f72e55c1bd9fe3ed96cf4588e582 WatchSource:0}: Error finding container 42438083f418910aeb9e471a4a248dfd82f5f72e55c1bd9fe3ed96cf4588e582: Status 404 returned error can't find the container with id 42438083f418910aeb9e471a4a248dfd82f5f72e55c1bd9fe3ed96cf4588e582 Apr 21 02:47:51.913906 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:51.913858 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" event={"ID":"a90ecd6e-461f-481a-9742-b405e68ba59e","Type":"ContainerStarted","Data":"42438083f418910aeb9e471a4a248dfd82f5f72e55c1bd9fe3ed96cf4588e582"} Apr 21 02:47:53.922146 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:53.922112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" event={"ID":"a90ecd6e-461f-481a-9742-b405e68ba59e","Type":"ContainerStarted","Data":"cb2b5d99127ac5dd84ea82a971d28aa361f15c88f2dabda7facfddfdec2ab311"} Apr 21 02:47:53.937102 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:47:53.937057 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-rd6z5" podStartSLOduration=1.020337881 podStartE2EDuration="3.937041726s" podCreationTimestamp="2026-04-21 02:47:50 +0000 UTC" firstStartedPulling="2026-04-21 02:47:50.912264672 +0000 UTC m=+403.845941725" lastFinishedPulling="2026-04-21 02:47:53.828968513 +0000 UTC 
m=+406.762645570" observedRunningTime="2026-04-21 02:47:53.935924133 +0000 UTC m=+406.869601211" watchObservedRunningTime="2026-04-21 02:47:53.937041726 +0000 UTC m=+406.870718803" Apr 21 02:49:30.773899 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.773865 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp"] Apr 21 02:49:30.777166 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.777148 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:30.779315 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.779287 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 02:49:30.779461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.779349 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 02:49:30.780005 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.779986 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hwh5\"" Apr 21 02:49:30.780102 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.780006 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 02:49:30.780102 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.780069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 02:49:30.783890 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.783869 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp"] Apr 21 02:49:30.928346 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.928304 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/021bd13d-9c14-450d-a843-0d7a2e31b55d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:30.928346 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.928351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k572\" (UniqueName: \"kubernetes.io/projected/021bd13d-9c14-450d-a843-0d7a2e31b55d-kube-api-access-9k572\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:30.928544 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:30.928459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/021bd13d-9c14-450d-a843-0d7a2e31b55d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.029447 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.029366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/021bd13d-9c14-450d-a843-0d7a2e31b55d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.029447 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.029420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/021bd13d-9c14-450d-a843-0d7a2e31b55d-plugin-serving-cert\") pod 
\"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.029636 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.029452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k572\" (UniqueName: \"kubernetes.io/projected/021bd13d-9c14-450d-a843-0d7a2e31b55d-kube-api-access-9k572\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.030103 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.030080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/021bd13d-9c14-450d-a843-0d7a2e31b55d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.031921 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.031904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/021bd13d-9c14-450d-a843-0d7a2e31b55d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.037969 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.037945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k572\" (UniqueName: \"kubernetes.io/projected/021bd13d-9c14-450d-a843-0d7a2e31b55d-kube-api-access-9k572\") pod \"kuadrant-console-plugin-6cb54b5c86-gsqfp\" (UID: \"021bd13d-9c14-450d-a843-0d7a2e31b55d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.097477 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.097443 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" Apr 21 02:49:31.220301 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.220277 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp"] Apr 21 02:49:31.225563 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:49:31.225533 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod021bd13d_9c14_450d_a843_0d7a2e31b55d.slice/crio-277c830f4c47b17e768c33017c5b4a8f9126d3ca9616de6a979cd7cf6377836a WatchSource:0}: Error finding container 277c830f4c47b17e768c33017c5b4a8f9126d3ca9616de6a979cd7cf6377836a: Status 404 returned error can't find the container with id 277c830f4c47b17e768c33017c5b4a8f9126d3ca9616de6a979cd7cf6377836a Apr 21 02:49:31.240861 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:31.240834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" event={"ID":"021bd13d-9c14-450d-a843-0d7a2e31b55d","Type":"ContainerStarted","Data":"277c830f4c47b17e768c33017c5b4a8f9126d3ca9616de6a979cd7cf6377836a"} Apr 21 02:49:54.324166 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:54.324121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" event={"ID":"021bd13d-9c14-450d-a843-0d7a2e31b55d","Type":"ContainerStarted","Data":"2d433dd73837141a04c7e221461c5658c40083f5a331a99b6bab51954fce1171"} Apr 21 02:49:54.365501 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:49:54.365400 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gsqfp" podStartSLOduration=1.50296495 podStartE2EDuration="24.365388523s" podCreationTimestamp="2026-04-21 02:49:30 +0000 UTC" firstStartedPulling="2026-04-21 02:49:31.226937154 +0000 UTC 
m=+504.160614208" lastFinishedPulling="2026-04-21 02:49:54.089360722 +0000 UTC m=+527.023037781" observedRunningTime="2026-04-21 02:49:54.364367417 +0000 UTC m=+527.298044494" watchObservedRunningTime="2026-04-21 02:49:54.365388523 +0000 UTC m=+527.299065599" Apr 21 02:50:32.967461 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:32.967429 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:32.971003 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:32.970981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:32.973209 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:32.973182 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-sscht\"" Apr 21 02:50:32.978538 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:32.978510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn64x\" (UniqueName: \"kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x\") pod \"authorino-f99f4b5cd-4n9dr\" (UID: \"be4ec63b-6271-440c-b545-953303b66d7c\") " pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:32.979167 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:32.979138 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:33.079868 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.079834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn64x\" (UniqueName: \"kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x\") pod \"authorino-f99f4b5cd-4n9dr\" (UID: \"be4ec63b-6271-440c-b545-953303b66d7c\") " pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:33.087379 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.087357 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn64x\" (UniqueName: \"kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x\") pod \"authorino-f99f4b5cd-4n9dr\" (UID: \"be4ec63b-6271-440c-b545-953303b66d7c\") " pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:33.157804 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.157773 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"] Apr 21 02:50:33.161297 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.161280 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-llntq" Apr 21 02:50:33.166673 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.166651 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"] Apr 21 02:50:33.180275 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.180252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262dx\" (UniqueName: \"kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx\") pod \"authorino-7498df8756-llntq\" (UID: \"f6af4a30-fd04-4a85-bec6-c6737b9bf95d\") " pod="kuadrant-system/authorino-7498df8756-llntq" Apr 21 02:50:33.281135 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.281047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-262dx\" (UniqueName: \"kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx\") pod \"authorino-7498df8756-llntq\" (UID: \"f6af4a30-fd04-4a85-bec6-c6737b9bf95d\") " pod="kuadrant-system/authorino-7498df8756-llntq" Apr 21 02:50:33.281135 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.281098 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:33.288198 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.288173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-262dx\" (UniqueName: \"kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx\") pod \"authorino-7498df8756-llntq\" (UID: \"f6af4a30-fd04-4a85-bec6-c6737b9bf95d\") " pod="kuadrant-system/authorino-7498df8756-llntq" Apr 21 02:50:33.396868 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.396844 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:33.399423 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:50:33.399392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe4ec63b_6271_440c_b545_953303b66d7c.slice/crio-2bc8489a084b4c5e23b0ff7957157a900ab801cdb7c9fccb7fab1a32cc48b134 WatchSource:0}: Error finding container 2bc8489a084b4c5e23b0ff7957157a900ab801cdb7c9fccb7fab1a32cc48b134: Status 404 returned error can't find the container with id 2bc8489a084b4c5e23b0ff7957157a900ab801cdb7c9fccb7fab1a32cc48b134 Apr 21 02:50:33.453598 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.453570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" event={"ID":"be4ec63b-6271-440c-b545-953303b66d7c","Type":"ContainerStarted","Data":"2bc8489a084b4c5e23b0ff7957157a900ab801cdb7c9fccb7fab1a32cc48b134"} Apr 21 02:50:33.471765 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.471736 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-llntq" Apr 21 02:50:33.590807 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:33.590783 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"] Apr 21 02:50:33.593075 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:50:33.593032 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6af4a30_fd04_4a85_bec6_c6737b9bf95d.slice/crio-c26204f41770df1473fc1172c83e365b6c7f700ff62865d4a716d87a728baf21 WatchSource:0}: Error finding container c26204f41770df1473fc1172c83e365b6c7f700ff62865d4a716d87a728baf21: Status 404 returned error can't find the container with id c26204f41770df1473fc1172c83e365b6c7f700ff62865d4a716d87a728baf21 Apr 21 02:50:34.460907 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:34.460862 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-llntq" event={"ID":"f6af4a30-fd04-4a85-bec6-c6737b9bf95d","Type":"ContainerStarted","Data":"c26204f41770df1473fc1172c83e365b6c7f700ff62865d4a716d87a728baf21"} Apr 21 02:50:37.474365 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:37.474331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" event={"ID":"be4ec63b-6271-440c-b545-953303b66d7c","Type":"ContainerStarted","Data":"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9"} Apr 21 02:50:37.475655 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:37.475633 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-llntq" event={"ID":"f6af4a30-fd04-4a85-bec6-c6737b9bf95d","Type":"ContainerStarted","Data":"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"} Apr 21 02:50:37.489447 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:37.489407 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" podStartSLOduration=1.992739021 podStartE2EDuration="5.48939584s" podCreationTimestamp="2026-04-21 02:50:32 +0000 UTC" firstStartedPulling="2026-04-21 02:50:33.400655482 +0000 UTC m=+566.334332536" lastFinishedPulling="2026-04-21 02:50:36.897312301 +0000 UTC m=+569.830989355" observedRunningTime="2026-04-21 02:50:37.487676671 +0000 UTC m=+570.421353772" watchObservedRunningTime="2026-04-21 02:50:37.48939584 +0000 UTC m=+570.423072915" Apr 21 02:50:37.501142 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:37.501100 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-llntq" podStartSLOduration=1.209784661 podStartE2EDuration="4.501086547s" podCreationTimestamp="2026-04-21 02:50:33 +0000 UTC" firstStartedPulling="2026-04-21 02:50:33.594360312 +0000 UTC m=+566.528037366" lastFinishedPulling="2026-04-21 02:50:36.885662194 +0000 UTC m=+569.819339252" observedRunningTime="2026-04-21 02:50:37.500439812 +0000 UTC m=+570.434116890" watchObservedRunningTime="2026-04-21 02:50:37.501086547 +0000 UTC m=+570.434763623" Apr 21 02:50:37.553470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:37.553438 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:39.482158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:39.482118 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" podUID="be4ec63b-6271-440c-b545-953303b66d7c" containerName="authorino" containerID="cri-o://8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9" gracePeriod=30 Apr 21 02:50:39.720525 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:39.720502 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:39.836294 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:39.836176 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn64x\" (UniqueName: \"kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x\") pod \"be4ec63b-6271-440c-b545-953303b66d7c\" (UID: \"be4ec63b-6271-440c-b545-953303b66d7c\") " Apr 21 02:50:39.838439 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:39.838415 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x" (OuterVolumeSpecName: "kube-api-access-qn64x") pod "be4ec63b-6271-440c-b545-953303b66d7c" (UID: "be4ec63b-6271-440c-b545-953303b66d7c"). InnerVolumeSpecName "kube-api-access-qn64x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:50:39.936805 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:39.936777 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qn64x\" (UniqueName: \"kubernetes.io/projected/be4ec63b-6271-440c-b545-953303b66d7c-kube-api-access-qn64x\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:50:40.486790 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.486755 2572 generic.go:358] "Generic (PLEG): container finished" podID="be4ec63b-6271-440c-b545-953303b66d7c" containerID="8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9" exitCode=0 Apr 21 02:50:40.487196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.486800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" event={"ID":"be4ec63b-6271-440c-b545-953303b66d7c","Type":"ContainerDied","Data":"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9"} Apr 21 02:50:40.487196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.486807 2572 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" Apr 21 02:50:40.487196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.486821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4n9dr" event={"ID":"be4ec63b-6271-440c-b545-953303b66d7c","Type":"ContainerDied","Data":"2bc8489a084b4c5e23b0ff7957157a900ab801cdb7c9fccb7fab1a32cc48b134"} Apr 21 02:50:40.487196 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.486837 2572 scope.go:117] "RemoveContainer" containerID="8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9" Apr 21 02:50:40.494976 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.494948 2572 scope.go:117] "RemoveContainer" containerID="8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9" Apr 21 02:50:40.495266 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:50:40.495213 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9\": container with ID starting with 8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9 not found: ID does not exist" containerID="8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9" Apr 21 02:50:40.495332 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.495258 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9"} err="failed to get container status \"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9\": rpc error: code = NotFound desc = could not find container \"8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9\": container with ID starting with 8ac3d6b2c0e709a80b9d7929889f24d0e67db33052d516d1bcdf813e21829ee9 not found: ID does not exist" Apr 21 02:50:40.506736 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.506715 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:40.510142 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:40.510121 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4n9dr"] Apr 21 02:50:41.622541 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:41.622507 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4ec63b-6271-440c-b545-953303b66d7c" path="/var/lib/kubelet/pods/be4ec63b-6271-440c-b545-953303b66d7c/volumes" Apr 21 02:50:53.196392 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.196355 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-x8k6q"] Apr 21 02:50:53.196876 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.196751 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be4ec63b-6271-440c-b545-953303b66d7c" containerName="authorino" Apr 21 02:50:53.196876 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.196764 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4ec63b-6271-440c-b545-953303b66d7c" containerName="authorino" Apr 21 02:50:53.196876 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.196818 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="be4ec63b-6271-440c-b545-953303b66d7c" containerName="authorino" Apr 21 02:50:53.201174 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.201147 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.203981 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.203955 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-gjxgv\"" Apr 21 02:50:53.204329 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.204312 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 02:50:53.210532 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.208193 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-x8k6q"] Apr 21 02:50:53.253806 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.253762 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/2c8c63fa-d388-4479-acac-a4d0d7584161-kube-api-access-55hls\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.253942 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.253842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2c8c63fa-d388-4479-acac-a4d0d7584161-data\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.355200 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.355146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2c8c63fa-d388-4479-acac-a4d0d7584161-data\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.355364 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.355254 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/2c8c63fa-d388-4479-acac-a4d0d7584161-kube-api-access-55hls\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.355527 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.355508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2c8c63fa-d388-4479-acac-a4d0d7584161-data\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.362894 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.362869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/2c8c63fa-d388-4479-acac-a4d0d7584161-kube-api-access-55hls\") pod \"postgres-868db5846d-x8k6q\" (UID: \"2c8c63fa-d388-4479-acac-a4d0d7584161\") " pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.519190 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.519116 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:50:53.637822 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:53.637759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-x8k6q"] Apr 21 02:50:53.640143 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:50:53.640112 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c8c63fa_d388_4479_acac_a4d0d7584161.slice/crio-981812a65f40ae71d075b9ff550afc3f01939121f911e9fbb0e39cbdd7e03e54 WatchSource:0}: Error finding container 981812a65f40ae71d075b9ff550afc3f01939121f911e9fbb0e39cbdd7e03e54: Status 404 returned error can't find the container with id 981812a65f40ae71d075b9ff550afc3f01939121f911e9fbb0e39cbdd7e03e54 Apr 21 02:50:54.534596 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:50:54.534561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-x8k6q" event={"ID":"2c8c63fa-d388-4479-acac-a4d0d7584161","Type":"ContainerStarted","Data":"981812a65f40ae71d075b9ff550afc3f01939121f911e9fbb0e39cbdd7e03e54"} Apr 21 02:51:00.556567 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:00.556532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-x8k6q" event={"ID":"2c8c63fa-d388-4479-acac-a4d0d7584161","Type":"ContainerStarted","Data":"290215a26f11742f597d1d6e02218a795483a953c979e257eacbc5b3233e3152"} Apr 21 02:51:00.556946 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:00.556655 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:51:00.572932 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:00.572885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-x8k6q" podStartSLOduration=1.6861700069999999 podStartE2EDuration="7.572871971s" podCreationTimestamp="2026-04-21 02:50:53 +0000 UTC" 
firstStartedPulling="2026-04-21 02:50:53.641430421 +0000 UTC m=+586.575107476" lastFinishedPulling="2026-04-21 02:50:59.528132386 +0000 UTC m=+592.461809440" observedRunningTime="2026-04-21 02:51:00.571047762 +0000 UTC m=+593.504724837" watchObservedRunningTime="2026-04-21 02:51:00.572871971 +0000 UTC m=+593.506549093" Apr 21 02:51:06.588205 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:06.588177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-x8k6q" Apr 21 02:51:07.485416 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:07.485386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:51:07.485590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:07.485423 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:51:35.886360 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.886277 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:51:35.889951 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.889929 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:35.892053 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.892030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 02:51:35.892053 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.892043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-fmljj\"" Apr 21 02:51:35.892214 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.892095 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 02:51:35.892755 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.892742 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 02:51:35.896055 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.896039 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:51:35.923077 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:35.923052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkx8k\" (UniqueName: \"kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k\") pod \"maas-keycloak-0\" (UID: \"bd62f575-8690-4fdb-b574-4642e9a8cdda\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:36.024268 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:36.024223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkx8k\" (UniqueName: \"kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k\") pod \"maas-keycloak-0\" (UID: \"bd62f575-8690-4fdb-b574-4642e9a8cdda\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:36.034437 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:36.034409 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkx8k\" (UniqueName: \"kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k\") pod \"maas-keycloak-0\" (UID: \"bd62f575-8690-4fdb-b574-4642e9a8cdda\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:36.200717 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:36.200678 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:36.317185 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:36.317160 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:51:36.670050 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:36.670017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"bd62f575-8690-4fdb-b574-4642e9a8cdda","Type":"ContainerStarted","Data":"50e7e8ee7192569e1e1f50fa027305fcaebdfd8c333428f2c53f92ca8bf15fe3"} Apr 21 02:51:41.692736 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:41.692686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"bd62f575-8690-4fdb-b574-4642e9a8cdda","Type":"ContainerStarted","Data":"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"} Apr 21 02:51:41.710739 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:41.710679 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.8230740970000001 podStartE2EDuration="6.710658796s" podCreationTimestamp="2026-04-21 02:51:35 +0000 UTC" firstStartedPulling="2026-04-21 02:51:36.322283245 +0000 UTC m=+629.255960319" lastFinishedPulling="2026-04-21 02:51:41.209867962 +0000 UTC m=+634.143545018" observedRunningTime="2026-04-21 02:51:41.708505887 +0000 UTC m=+634.642182965" watchObservedRunningTime="2026-04-21 02:51:41.710658796 +0000 UTC m=+634.644335873" Apr 21 02:51:42.201328 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:42.201278 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:42.203262 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:42.203182 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:43.202262 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:43.202173 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:44.201225 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:44.201178 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:45.202184 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:45.202131 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:46.201468 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:46.201422 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:46.201733 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:46.201694 2572 prober.go:120] "Probe 
failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:47.201793 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:47.201735 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:48.202012 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:48.201952 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:49.201376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:49.201321 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:50.201296 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:50.201254 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:51.201371 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:51.201324 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:52.202114 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:52.202061 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:53.201623 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:53.201575 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:54.201831 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:54.201780 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.28:9000/health/started\": dial tcp 10.133.0.28:9000: connect: connection refused" Apr 21 02:51:55.321366 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:55.321329 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 02:51:55.342880 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:51:55.342829 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 02:52:05.327590 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:05.327556 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:52:06.393843 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.393802 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"] Apr 21 02:52:06.403507 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.403486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v2m48" Apr 21 02:52:06.407509 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.407483 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"] Apr 21 02:52:06.511123 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.511087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlsk\" (UniqueName: \"kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk\") pod \"authorino-8b475cf9f-v2m48\" (UID: \"fa6368d8-3964-448c-bc7a-76c0c6f06d8d\") " pod="kuadrant-system/authorino-8b475cf9f-v2m48" Apr 21 02:52:06.611616 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.611579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grlsk\" (UniqueName: \"kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk\") pod \"authorino-8b475cf9f-v2m48\" (UID: \"fa6368d8-3964-448c-bc7a-76c0c6f06d8d\") " pod="kuadrant-system/authorino-8b475cf9f-v2m48" Apr 21 02:52:06.619926 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.619895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlsk\" (UniqueName: \"kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk\") pod \"authorino-8b475cf9f-v2m48\" (UID: \"fa6368d8-3964-448c-bc7a-76c0c6f06d8d\") " pod="kuadrant-system/authorino-8b475cf9f-v2m48" Apr 21 02:52:06.679125 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.679026 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"] Apr 21 02:52:06.679337 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.679297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v2m48" Apr 21 02:52:06.705547 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.705517 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"] Apr 21 02:52:06.710560 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.710539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7778957d74-rl82q" Apr 21 02:52:06.714468 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.714448 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"] Apr 21 02:52:06.810386 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.810267 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"] Apr 21 02:52:06.813163 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.813132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnlh\" (UniqueName: \"kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh\") pod \"authorino-7778957d74-rl82q\" (UID: \"587c8a22-1174-448a-8f81-bec00c904a34\") " pod="kuadrant-system/authorino-7778957d74-rl82q" Apr 21 02:52:06.813318 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:52:06.813201 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa6368d8_3964_448c_bc7a_76c0c6f06d8d.slice/crio-ad152640b80699cf3b4f11c712de949ccebe7e9e19242e1dfcc8137ecfd0951a WatchSource:0}: Error finding container ad152640b80699cf3b4f11c712de949ccebe7e9e19242e1dfcc8137ecfd0951a: Status 404 returned error can't find the container with id 
ad152640b80699cf3b4f11c712de949ccebe7e9e19242e1dfcc8137ecfd0951a Apr 21 02:52:06.914213 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.914180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnlh\" (UniqueName: \"kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh\") pod \"authorino-7778957d74-rl82q\" (UID: \"587c8a22-1174-448a-8f81-bec00c904a34\") " pod="kuadrant-system/authorino-7778957d74-rl82q" Apr 21 02:52:06.921572 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.921553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lnlh\" (UniqueName: \"kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh\") pod \"authorino-7778957d74-rl82q\" (UID: \"587c8a22-1174-448a-8f81-bec00c904a34\") " pod="kuadrant-system/authorino-7778957d74-rl82q" Apr 21 02:52:06.960468 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.960395 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"] Apr 21 02:52:06.960647 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.960636 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7778957d74-rl82q" Apr 21 02:52:06.987438 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.987409 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"] Apr 21 02:52:06.991914 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.991893 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:06.993994 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.993971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 02:52:06.998388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:06.997329 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"] Apr 21 02:52:07.081227 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.081192 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"] Apr 21 02:52:07.084961 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:52:07.084933 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod587c8a22_1174_448a_8f81_bec00c904a34.slice/crio-8995b5724ec730adddfb7662662dd94dcefd1fff5fadc48ed59f3931aab5f1d4 WatchSource:0}: Error finding container 8995b5724ec730adddfb7662662dd94dcefd1fff5fadc48ed59f3931aab5f1d4: Status 404 returned error can't find the container with id 8995b5724ec730adddfb7662662dd94dcefd1fff5fadc48ed59f3931aab5f1d4 Apr 21 02:52:07.115359 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.115331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert\") pod \"authorino-f45bbf446-kqt6s\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.115490 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.115377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlkdk\" (UniqueName: \"kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk\") pod \"authorino-f45bbf446-kqt6s\" (UID: 
\"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.215904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.215820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert\") pod \"authorino-f45bbf446-kqt6s\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.215904 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.215887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlkdk\" (UniqueName: \"kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk\") pod \"authorino-f45bbf446-kqt6s\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.218204 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.218184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert\") pod \"authorino-f45bbf446-kqt6s\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.223979 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.223957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlkdk\" (UniqueName: \"kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk\") pod \"authorino-f45bbf446-kqt6s\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:52:07.303113 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.303077 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f45bbf446-kqt6s"
Apr 21 02:52:07.440981 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.440917 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"]
Apr 21 02:52:07.444028 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:52:07.444005 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cf11b8c_24bf_498f_853e_dc64fd7cca7f.slice/crio-501df13b1cb5b6ec33437b208ac054e5cc6588e686ff8682c06d2c9bd78fbe45 WatchSource:0}: Error finding container 501df13b1cb5b6ec33437b208ac054e5cc6588e686ff8682c06d2c9bd78fbe45: Status 404 returned error can't find the container with id 501df13b1cb5b6ec33437b208ac054e5cc6588e686ff8682c06d2c9bd78fbe45
Apr 21 02:52:07.797846 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.797752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7778957d74-rl82q" event={"ID":"587c8a22-1174-448a-8f81-bec00c904a34","Type":"ContainerStarted","Data":"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"}
Apr 21 02:52:07.797846 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.797802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7778957d74-rl82q" event={"ID":"587c8a22-1174-448a-8f81-bec00c904a34","Type":"ContainerStarted","Data":"8995b5724ec730adddfb7662662dd94dcefd1fff5fadc48ed59f3931aab5f1d4"}
Apr 21 02:52:07.797846 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.797834 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7778957d74-rl82q" podUID="587c8a22-1174-448a-8f81-bec00c904a34" containerName="authorino" containerID="cri-o://236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71" gracePeriod=30
Apr 21 02:52:07.799282 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.799227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-v2m48" event={"ID":"fa6368d8-3964-448c-bc7a-76c0c6f06d8d","Type":"ContainerStarted","Data":"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"}
Apr 21 02:52:07.799411 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.799291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-v2m48" event={"ID":"fa6368d8-3964-448c-bc7a-76c0c6f06d8d","Type":"ContainerStarted","Data":"ad152640b80699cf3b4f11c712de949ccebe7e9e19242e1dfcc8137ecfd0951a"}
Apr 21 02:52:07.799411 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.799303 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-v2m48" podUID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" containerName="authorino" containerID="cri-o://c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc" gracePeriod=30
Apr 21 02:52:07.800386 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.800362 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f45bbf446-kqt6s" event={"ID":"8cf11b8c-24bf-498f-853e-dc64fd7cca7f","Type":"ContainerStarted","Data":"501df13b1cb5b6ec33437b208ac054e5cc6588e686ff8682c06d2c9bd78fbe45"}
Apr 21 02:52:07.816042 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.816000 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7778957d74-rl82q" podStartSLOduration=1.492377138 podStartE2EDuration="1.815979554s" podCreationTimestamp="2026-04-21 02:52:06 +0000 UTC" firstStartedPulling="2026-04-21 02:52:07.086350895 +0000 UTC m=+660.020027953" lastFinishedPulling="2026-04-21 02:52:07.409953312 +0000 UTC m=+660.343630369" observedRunningTime="2026-04-21 02:52:07.814141809 +0000 UTC m=+660.747818882" watchObservedRunningTime="2026-04-21 02:52:07.815979554 +0000 UTC m=+660.749656632"
Apr 21 02:52:07.829026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:07.828987 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-v2m48" podStartSLOduration=1.3237133669999999 podStartE2EDuration="1.828975669s" podCreationTimestamp="2026-04-21 02:52:06 +0000 UTC" firstStartedPulling="2026-04-21 02:52:06.814814881 +0000 UTC m=+659.748491935" lastFinishedPulling="2026-04-21 02:52:07.320077179 +0000 UTC m=+660.253754237" observedRunningTime="2026-04-21 02:52:07.827358883 +0000 UTC m=+660.761035959" watchObservedRunningTime="2026-04-21 02:52:07.828975669 +0000 UTC m=+660.762652746"
Apr 21 02:52:08.058020 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.057992 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7778957d74-rl82q"
Apr 21 02:52:08.067778 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.067755 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v2m48"
Apr 21 02:52:08.226018 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.225987 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lnlh\" (UniqueName: \"kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh\") pod \"587c8a22-1174-448a-8f81-bec00c904a34\" (UID: \"587c8a22-1174-448a-8f81-bec00c904a34\") "
Apr 21 02:52:08.226178 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.226034 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grlsk\" (UniqueName: \"kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk\") pod \"fa6368d8-3964-448c-bc7a-76c0c6f06d8d\" (UID: \"fa6368d8-3964-448c-bc7a-76c0c6f06d8d\") "
Apr 21 02:52:08.228160 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.228118 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh" (OuterVolumeSpecName: "kube-api-access-8lnlh") pod "587c8a22-1174-448a-8f81-bec00c904a34" (UID: "587c8a22-1174-448a-8f81-bec00c904a34"). InnerVolumeSpecName "kube-api-access-8lnlh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:52:08.228160 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.228138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk" (OuterVolumeSpecName: "kube-api-access-grlsk") pod "fa6368d8-3964-448c-bc7a-76c0c6f06d8d" (UID: "fa6368d8-3964-448c-bc7a-76c0c6f06d8d"). InnerVolumeSpecName "kube-api-access-grlsk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:52:08.326825 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.326761 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8lnlh\" (UniqueName: \"kubernetes.io/projected/587c8a22-1174-448a-8f81-bec00c904a34-kube-api-access-8lnlh\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\""
Apr 21 02:52:08.326825 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.326785 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grlsk\" (UniqueName: \"kubernetes.io/projected/fa6368d8-3964-448c-bc7a-76c0c6f06d8d-kube-api-access-grlsk\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\""
Apr 21 02:52:08.805020 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.804986 2572 generic.go:358] "Generic (PLEG): container finished" podID="587c8a22-1174-448a-8f81-bec00c904a34" containerID="236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71" exitCode=0
Apr 21 02:52:08.805470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.805036 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7778957d74-rl82q"
Apr 21 02:52:08.805470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.805074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7778957d74-rl82q" event={"ID":"587c8a22-1174-448a-8f81-bec00c904a34","Type":"ContainerDied","Data":"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"}
Apr 21 02:52:08.805470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.805108 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7778957d74-rl82q" event={"ID":"587c8a22-1174-448a-8f81-bec00c904a34","Type":"ContainerDied","Data":"8995b5724ec730adddfb7662662dd94dcefd1fff5fadc48ed59f3931aab5f1d4"}
Apr 21 02:52:08.805470 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.805132 2572 scope.go:117] "RemoveContainer" containerID="236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"
Apr 21 02:52:08.806352 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.806262 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" containerID="c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc" exitCode=0
Apr 21 02:52:08.806352 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.806303 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-v2m48"
Apr 21 02:52:08.806463 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.806343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-v2m48" event={"ID":"fa6368d8-3964-448c-bc7a-76c0c6f06d8d","Type":"ContainerDied","Data":"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"}
Apr 21 02:52:08.806463 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.806376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-v2m48" event={"ID":"fa6368d8-3964-448c-bc7a-76c0c6f06d8d","Type":"ContainerDied","Data":"ad152640b80699cf3b4f11c712de949ccebe7e9e19242e1dfcc8137ecfd0951a"}
Apr 21 02:52:08.807825 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.807804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f45bbf446-kqt6s" event={"ID":"8cf11b8c-24bf-498f-853e-dc64fd7cca7f","Type":"ContainerStarted","Data":"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c"}
Apr 21 02:52:08.817936 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.817599 2572 scope.go:117] "RemoveContainer" containerID="236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"
Apr 21 02:52:08.818075 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:52:08.818047 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71\": container with ID starting with 236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71 not found: ID does not exist" containerID="236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"
Apr 21 02:52:08.818139 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.818084 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71"} err="failed to get container status \"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71\": rpc error: code = NotFound desc = could not find container \"236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71\": container with ID starting with 236bd1d95650ed3dbf22a5c42ed36ec9c69e24b32800cf15e22a3243b21eed71 not found: ID does not exist"
Apr 21 02:52:08.818139 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.818115 2572 scope.go:117] "RemoveContainer" containerID="c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"
Apr 21 02:52:08.825743 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.825726 2572 scope.go:117] "RemoveContainer" containerID="c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"
Apr 21 02:52:08.825972 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:52:08.825954 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc\": container with ID starting with c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc not found: ID does not exist" containerID="c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"
Apr 21 02:52:08.826030 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.825983 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc"} err="failed to get container status \"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc\": rpc error: code = NotFound desc = could not find container \"c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc\": container with ID starting with c8dba57f64758f2587717ca0d6d640b4afcf4c240df0e83f094853abe6883abc not found: ID does not exist"
Apr 21 02:52:08.830839 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.830803 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f45bbf446-kqt6s" podStartSLOduration=2.356216269 podStartE2EDuration="2.830791862s" podCreationTimestamp="2026-04-21 02:52:06 +0000 UTC" firstStartedPulling="2026-04-21 02:52:07.445563438 +0000 UTC m=+660.379240498" lastFinishedPulling="2026-04-21 02:52:07.920139036 +0000 UTC m=+660.853816091" observedRunningTime="2026-04-21 02:52:08.830442054 +0000 UTC m=+661.764119132" watchObservedRunningTime="2026-04-21 02:52:08.830791862 +0000 UTC m=+661.764468938"
Apr 21 02:52:08.848604 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.848576 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"]
Apr 21 02:52:08.853038 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.853012 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-v2m48"]
Apr 21 02:52:08.856094 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.856073 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"]
Apr 21 02:52:08.856286 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.856266 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-llntq" podUID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" containerName="authorino" containerID="cri-o://cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132" gracePeriod=30
Apr 21 02:52:08.870070 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.870047 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"]
Apr 21 02:52:08.874425 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:08.874213 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7778957d74-rl82q"]
Apr 21 02:52:09.097400 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.097375 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-llntq"
Apr 21 02:52:09.135574 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.135505 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262dx\" (UniqueName: \"kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx\") pod \"f6af4a30-fd04-4a85-bec6-c6737b9bf95d\" (UID: \"f6af4a30-fd04-4a85-bec6-c6737b9bf95d\") "
Apr 21 02:52:09.137441 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.137411 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx" (OuterVolumeSpecName: "kube-api-access-262dx") pod "f6af4a30-fd04-4a85-bec6-c6737b9bf95d" (UID: "f6af4a30-fd04-4a85-bec6-c6737b9bf95d"). InnerVolumeSpecName "kube-api-access-262dx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:52:09.236084 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.236051 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-262dx\" (UniqueName: \"kubernetes.io/projected/f6af4a30-fd04-4a85-bec6-c6737b9bf95d-kube-api-access-262dx\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\""
Apr 21 02:52:09.622555 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.622523 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587c8a22-1174-448a-8f81-bec00c904a34" path="/var/lib/kubelet/pods/587c8a22-1174-448a-8f81-bec00c904a34/volumes"
Apr 21 02:52:09.622853 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.622841 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" path="/var/lib/kubelet/pods/fa6368d8-3964-448c-bc7a-76c0c6f06d8d/volumes"
Apr 21 02:52:09.812472 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.812437 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" containerID="cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132" exitCode=0
Apr 21 02:52:09.812872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.812487 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-llntq"
Apr 21 02:52:09.812872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.812516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-llntq" event={"ID":"f6af4a30-fd04-4a85-bec6-c6737b9bf95d","Type":"ContainerDied","Data":"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"}
Apr 21 02:52:09.812872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.812556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-llntq" event={"ID":"f6af4a30-fd04-4a85-bec6-c6737b9bf95d","Type":"ContainerDied","Data":"c26204f41770df1473fc1172c83e365b6c7f700ff62865d4a716d87a728baf21"}
Apr 21 02:52:09.812872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.812573 2572 scope.go:117] "RemoveContainer" containerID="cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"
Apr 21 02:52:09.820959 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.820938 2572 scope.go:117] "RemoveContainer" containerID="cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"
Apr 21 02:52:09.821211 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:52:09.821193 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132\": container with ID starting with cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132 not found: ID does not exist" containerID="cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"
Apr 21 02:52:09.821276 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.821225 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132"} err="failed to get container status \"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132\": rpc error: code = NotFound desc = could not find container \"cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132\": container with ID starting with cf5d2e8e8d4c9b9240de18dce0081261842e9d498f1558e5ee92bfbb9cf6e132 not found: ID does not exist"
Apr 21 02:52:09.835008 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.834983 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"]
Apr 21 02:52:09.846971 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:09.846951 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-llntq"]
Apr 21 02:52:11.621841 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:11.621811 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" path="/var/lib/kubelet/pods/f6af4a30-fd04-4a85-bec6-c6737b9bf95d/volumes"
Apr 21 02:52:15.794739 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.794706 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"]
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795035 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" containerName="authorino"
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795046 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" containerName="authorino"
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795060 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587c8a22-1174-448a-8f81-bec00c904a34" containerName="authorino"
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795065 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="587c8a22-1174-448a-8f81-bec00c904a34" containerName="authorino"
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795078 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" containerName="authorino"
Apr 21 02:52:15.795106 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795084 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" containerName="authorino"
Apr 21 02:52:15.795361 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795139 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa6368d8-3964-448c-bc7a-76c0c6f06d8d" containerName="authorino"
Apr 21 02:52:15.795361 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795156 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6af4a30-fd04-4a85-bec6-c6737b9bf95d" containerName="authorino"
Apr 21 02:52:15.795361 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.795165 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="587c8a22-1174-448a-8f81-bec00c904a34" containerName="authorino"
Apr 21 02:52:15.799436 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.799418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:15.801737 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.801713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 21 02:52:15.801874 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.801769 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 21 02:52:15.801874 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.801771 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6jkpt\""
Apr 21 02:52:15.806013 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.805990 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"]
Apr 21 02:52:15.892102 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.892070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:15.892254 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.892143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfs77\" (UniqueName: \"kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:15.992681 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.992647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfs77\" (UniqueName: \"kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:15.992832 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.992710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:15.995041 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:15.995017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:16.000384 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:16.000364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfs77\" (UniqueName: \"kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77\") pod \"maas-api-7895b8759-swqdz\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:16.112424 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:16.112345 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:16.235087 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:16.235064 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"]
Apr 21 02:52:16.237546 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:52:16.237520 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2f53e8_92fe_4550_9e80_002632ae5c8c.slice/crio-bc6bebb86a3b34f227aae1c41f89c3356c3b9c0b9fb50a8b2b64d2ee0d63e39b WatchSource:0}: Error finding container bc6bebb86a3b34f227aae1c41f89c3356c3b9c0b9fb50a8b2b64d2ee0d63e39b: Status 404 returned error can't find the container with id bc6bebb86a3b34f227aae1c41f89c3356c3b9c0b9fb50a8b2b64d2ee0d63e39b
Apr 21 02:52:16.845048 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:16.845010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7895b8759-swqdz" event={"ID":"1f2f53e8-92fe-4550-9e80-002632ae5c8c","Type":"ContainerStarted","Data":"bc6bebb86a3b34f227aae1c41f89c3356c3b9c0b9fb50a8b2b64d2ee0d63e39b"}
Apr 21 02:52:18.854405 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:18.854368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7895b8759-swqdz" event={"ID":"1f2f53e8-92fe-4550-9e80-002632ae5c8c","Type":"ContainerStarted","Data":"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c"}
Apr 21 02:52:18.854774 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:18.854547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:18.872554 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:18.872508 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7895b8759-swqdz" podStartSLOduration=1.845584196 podStartE2EDuration="3.872493024s" podCreationTimestamp="2026-04-21 02:52:15 +0000 UTC" firstStartedPulling="2026-04-21 02:52:16.239213455 +0000 UTC m=+669.172890510" lastFinishedPulling="2026-04-21 02:52:18.266122281 +0000 UTC m=+671.199799338" observedRunningTime="2026-04-21 02:52:18.870143009 +0000 UTC m=+671.803820085" watchObservedRunningTime="2026-04-21 02:52:18.872493024 +0000 UTC m=+671.806170100"
Apr 21 02:52:24.862735 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:24.862706 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7895b8759-swqdz"
Apr 21 02:52:39.673012 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:39.672972 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:39.673581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:39.673200 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak" containerID="cri-o://43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855" gracePeriod=30
Apr 21 02:52:41.709438 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.709416 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:41.827381 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.827351 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkx8k\" (UniqueName: \"kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k\") pod \"bd62f575-8690-4fdb-b574-4642e9a8cdda\" (UID: \"bd62f575-8690-4fdb-b574-4642e9a8cdda\") "
Apr 21 02:52:41.829544 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.829517 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k" (OuterVolumeSpecName: "kube-api-access-fkx8k") pod "bd62f575-8690-4fdb-b574-4642e9a8cdda" (UID: "bd62f575-8690-4fdb-b574-4642e9a8cdda"). InnerVolumeSpecName "kube-api-access-fkx8k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:52:41.928043 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.928014 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkx8k\" (UniqueName: \"kubernetes.io/projected/bd62f575-8690-4fdb-b574-4642e9a8cdda-kube-api-access-fkx8k\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\""
Apr 21 02:52:41.930113 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.930084 2572 generic.go:358] "Generic (PLEG): container finished" podID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerID="43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855" exitCode=143
Apr 21 02:52:41.930226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.930141 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:41.930226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.930152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"bd62f575-8690-4fdb-b574-4642e9a8cdda","Type":"ContainerDied","Data":"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"}
Apr 21 02:52:41.930226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.930178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"bd62f575-8690-4fdb-b574-4642e9a8cdda","Type":"ContainerDied","Data":"50e7e8ee7192569e1e1f50fa027305fcaebdfd8c333428f2c53f92ca8bf15fe3"}
Apr 21 02:52:41.930226 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.930193 2572 scope.go:117] "RemoveContainer" containerID="43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"
Apr 21 02:52:41.939069 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.939047 2572 scope.go:117] "RemoveContainer" containerID="43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"
Apr 21 02:52:41.939349 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:52:41.939329 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855\": container with ID starting with 43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855 not found: ID does not exist" containerID="43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"
Apr 21 02:52:41.939424 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.939355 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855"} err="failed to get container status \"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855\": rpc error: code = NotFound desc = could not find container \"43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855\": container with ID starting with 43041144c7da534f249212bdbdf1198769e79a4369edbdcd046591143a79a855 not found: ID does not exist"
Apr 21 02:52:41.951200 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.951179 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:41.955343 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.955323 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:41.976663 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.976644 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:41.977023 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.977007 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak"
Apr 21 02:52:41.977102 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.977027 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak"
Apr 21 02:52:41.977154 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.977116 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" containerName="keycloak"
Apr 21 02:52:41.981367 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.981351 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:41.983698 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.983674 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 02:52:41.983698 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.983692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-fmljj\""
Apr 21 02:52:41.983853 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.983716 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\""
Apr 21 02:52:41.983853 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.983692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 21 02:52:41.983853 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.983765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 02:52:41.987402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:41.987382 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:42.129571 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.129536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chplp\" (UniqueName: \"kubernetes.io/projected/59637c79-7369-42f0-b139-0d677ad8f94e-kube-api-access-chplp\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.129744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.129654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/59637c79-7369-42f0-b139-0d677ad8f94e-test-realms\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.230471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.230391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/59637c79-7369-42f0-b139-0d677ad8f94e-test-realms\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.230471 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.230429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chplp\" (UniqueName: \"kubernetes.io/projected/59637c79-7369-42f0-b139-0d677ad8f94e-kube-api-access-chplp\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.231032 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.231012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/59637c79-7369-42f0-b139-0d677ad8f94e-test-realms\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.238577 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.238559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chplp\" (UniqueName: \"kubernetes.io/projected/59637c79-7369-42f0-b139-0d677ad8f94e-kube-api-access-chplp\") pod \"maas-keycloak-0\" (UID: \"59637c79-7369-42f0-b139-0d677ad8f94e\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.291577 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.291547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:52:42.409518 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.409494 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:52:42.412205 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:52:42.412178 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59637c79_7369_42f0_b139_0d677ad8f94e.slice/crio-9a3ac38d63b3e9d7b0f3c1fad6f9858d1e9db7f1c5332810d0b9817f1db421e2 WatchSource:0}: Error finding container 9a3ac38d63b3e9d7b0f3c1fad6f9858d1e9db7f1c5332810d0b9817f1db421e2: Status 404 returned error can't find the container with id 9a3ac38d63b3e9d7b0f3c1fad6f9858d1e9db7f1c5332810d0b9817f1db421e2
Apr 21 02:52:42.413450 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.413430 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 02:52:42.935744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.935708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"59637c79-7369-42f0-b139-0d677ad8f94e","Type":"ContainerStarted","Data":"ba0b2d4fd569f2b71c9b48b6f84b7532184791b045545aa59b350875582cd7c1"}
Apr 21 02:52:42.935744 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.935748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"59637c79-7369-42f0-b139-0d677ad8f94e","Type":"ContainerStarted","Data":"9a3ac38d63b3e9d7b0f3c1fad6f9858d1e9db7f1c5332810d0b9817f1db421e2"}
Apr 21 02:52:42.953117 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:42.953063 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.620222335 podStartE2EDuration="1.953050071s" podCreationTimestamp="2026-04-21 02:52:41 +0000 UTC" firstStartedPulling="2026-04-21 02:52:42.413557629 +0000
UTC m=+695.347234684" lastFinishedPulling="2026-04-21 02:52:42.746385363 +0000 UTC m=+695.680062420" observedRunningTime="2026-04-21 02:52:42.951874054 +0000 UTC m=+695.885551132" watchObservedRunningTime="2026-04-21 02:52:42.953050071 +0000 UTC m=+695.886727147" Apr 21 02:52:43.292603 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:43.292565 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 02:52:43.294082 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:43.294047 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:43.623347 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:43.623274 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd62f575-8690-4fdb-b574-4642e9a8cdda" path="/var/lib/kubelet/pods/bd62f575-8690-4fdb-b574-4642e9a8cdda/volumes" Apr 21 02:52:44.292686 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:44.292633 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:45.292933 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:45.292889 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:46.292772 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:46.292714 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:47.292706 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:47.292652 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:48.292326 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:48.292278 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:49.292391 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:49.292340 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:50.292043 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:50.291989 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:51.292854 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:51.292799 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" 
containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:52.292299 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:52.292256 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:52:52.292676 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:52.292641 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:53.292777 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:53.292715 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:54.292621 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:54.292572 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:55.292434 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:55.292386 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="Get \"http://10.133.0.33:9000/health/started\": dial tcp 10.133.0.33:9000: connect: connection refused" Apr 21 02:52:56.420364 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:56.420315 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 02:52:56.435123 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:52:56.435080 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="59637c79-7369-42f0-b139-0d677ad8f94e" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 02:53:06.427400 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.427360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:53:06.682171 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.682084 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"] Apr 21 02:53:06.682374 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.682333 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7895b8759-swqdz" podUID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" containerName="maas-api" containerID="cri-o://678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c" gracePeriod=30 Apr 21 02:53:06.921584 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.921560 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7895b8759-swqdz" Apr 21 02:53:06.986553 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.986486 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfs77\" (UniqueName: \"kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77\") pod \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " Apr 21 02:53:06.986553 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.986532 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls\") pod \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\" (UID: \"1f2f53e8-92fe-4550-9e80-002632ae5c8c\") " Apr 21 02:53:06.988695 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.988666 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77" (OuterVolumeSpecName: "kube-api-access-jfs77") pod "1f2f53e8-92fe-4550-9e80-002632ae5c8c" (UID: "1f2f53e8-92fe-4550-9e80-002632ae5c8c"). InnerVolumeSpecName "kube-api-access-jfs77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:53:06.988849 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:06.988726 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "1f2f53e8-92fe-4550-9e80-002632ae5c8c" (UID: "1f2f53e8-92fe-4550-9e80-002632ae5c8c"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:53:07.030582 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.030553 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" containerID="678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c" exitCode=0 Apr 21 02:53:07.030705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.030614 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7895b8759-swqdz" Apr 21 02:53:07.030705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.030611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7895b8759-swqdz" event={"ID":"1f2f53e8-92fe-4550-9e80-002632ae5c8c","Type":"ContainerDied","Data":"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c"} Apr 21 02:53:07.030705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.030661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7895b8759-swqdz" event={"ID":"1f2f53e8-92fe-4550-9e80-002632ae5c8c","Type":"ContainerDied","Data":"bc6bebb86a3b34f227aae1c41f89c3356c3b9c0b9fb50a8b2b64d2ee0d63e39b"} Apr 21 02:53:07.030705 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.030677 2572 scope.go:117] "RemoveContainer" containerID="678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c" Apr 21 02:53:07.038819 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.038800 2572 scope.go:117] "RemoveContainer" containerID="678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c" Apr 21 02:53:07.039082 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:53:07.039061 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c\": container with ID starting with 678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c not found: ID does not exist" 
containerID="678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c" Apr 21 02:53:07.039130 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.039089 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c"} err="failed to get container status \"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c\": rpc error: code = NotFound desc = could not find container \"678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c\": container with ID starting with 678203f0e9f829694ef56cbbafeb92d70543c44eeb908f2590b4dd5bcf7bb66c not found: ID does not exist" Apr 21 02:53:07.052280 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.052258 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"] Apr 21 02:53:07.056703 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.056681 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7895b8759-swqdz"] Apr 21 02:53:07.087888 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.087865 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jfs77\" (UniqueName: \"kubernetes.io/projected/1f2f53e8-92fe-4550-9e80-002632ae5c8c-kube-api-access-jfs77\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:53:07.087888 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.087888 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1f2f53e8-92fe-4550-9e80-002632ae5c8c-maas-api-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:53:07.623990 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:07.623959 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" path="/var/lib/kubelet/pods/1f2f53e8-92fe-4550-9e80-002632ae5c8c/volumes" Apr 21 02:53:17.095804 ip-10-0-134-66 
kubenswrapper[2572]: I0421 02:53:17.095773 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:53:17.096299 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.096109 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" containerName="maas-api" Apr 21 02:53:17.096299 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.096120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" containerName="maas-api" Apr 21 02:53:17.096299 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.096187 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f2f53e8-92fe-4550-9e80-002632ae5c8c" containerName="maas-api" Apr 21 02:53:17.100712 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.100691 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.102874 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.102850 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 21 02:53:17.106170 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.106147 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:53:17.186719 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.186683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trwln\" (UniqueName: \"kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.186850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.186734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.186850 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.186805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.288013 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.287977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.288175 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.288043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.288175 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.288114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trwln\" (UniqueName: \"kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.288756 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.288733 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.290437 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.290420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.295324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.295306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trwln\" (UniqueName: \"kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln\") pod \"authorino-7cbf88f478-bxwvr\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.411000 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.410971 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:53:17.528223 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:17.528083 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:53:17.530946 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:53:17.530911 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod371aab43_a7bf_419e_bab5_9d60d475ac2c.slice/crio-16d52f1f53c56fe39f6666ab3a48e085888981cd7979633c4c6f285601921a21 WatchSource:0}: Error finding container 16d52f1f53c56fe39f6666ab3a48e085888981cd7979633c4c6f285601921a21: Status 404 returned error can't find the container with id 16d52f1f53c56fe39f6666ab3a48e085888981cd7979633c4c6f285601921a21 Apr 21 02:53:18.067693 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:18.067661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" event={"ID":"371aab43-a7bf-419e-bab5-9d60d475ac2c","Type":"ContainerStarted","Data":"16d52f1f53c56fe39f6666ab3a48e085888981cd7979633c4c6f285601921a21"} Apr 21 02:53:19.073325 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.073288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" event={"ID":"371aab43-a7bf-419e-bab5-9d60d475ac2c","Type":"ContainerStarted","Data":"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2"} Apr 21 02:53:19.088300 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.088251 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" podStartSLOduration=1.5122337190000001 podStartE2EDuration="2.08822379s" podCreationTimestamp="2026-04-21 02:53:17 +0000 UTC" firstStartedPulling="2026-04-21 02:53:17.53225676 +0000 UTC m=+730.465933813" lastFinishedPulling="2026-04-21 02:53:18.108246826 +0000 UTC m=+731.041923884" 
observedRunningTime="2026-04-21 02:53:19.086634857 +0000 UTC m=+732.020311934" watchObservedRunningTime="2026-04-21 02:53:19.08822379 +0000 UTC m=+732.021900910" Apr 21 02:53:19.111966 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.111938 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"] Apr 21 02:53:19.112149 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.112130 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f45bbf446-kqt6s" podUID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" containerName="authorino" containerID="cri-o://69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c" gracePeriod=30 Apr 21 02:53:19.356201 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.356176 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:53:19.507375 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.507342 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlkdk\" (UniqueName: \"kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk\") pod \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " Apr 21 02:53:19.507375 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.507380 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert\") pod \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\" (UID: \"8cf11b8c-24bf-498f-853e-dc64fd7cca7f\") " Apr 21 02:53:19.509445 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.509414 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk" (OuterVolumeSpecName: "kube-api-access-rlkdk") pod 
"8cf11b8c-24bf-498f-853e-dc64fd7cca7f" (UID: "8cf11b8c-24bf-498f-853e-dc64fd7cca7f"). InnerVolumeSpecName "kube-api-access-rlkdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:53:19.517036 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.517009 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "8cf11b8c-24bf-498f-853e-dc64fd7cca7f" (UID: "8cf11b8c-24bf-498f-853e-dc64fd7cca7f"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:53:19.608267 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.608184 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlkdk\" (UniqueName: \"kubernetes.io/projected/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-kube-api-access-rlkdk\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:53:19.608267 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:19.608207 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8cf11b8c-24bf-498f-853e-dc64fd7cca7f-tls-cert\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:53:20.077786 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.077752 2572 generic.go:358] "Generic (PLEG): container finished" podID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" containerID="69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c" exitCode=0 Apr 21 02:53:20.078212 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.077804 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f45bbf446-kqt6s" Apr 21 02:53:20.078212 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.077844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f45bbf446-kqt6s" event={"ID":"8cf11b8c-24bf-498f-853e-dc64fd7cca7f","Type":"ContainerDied","Data":"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c"} Apr 21 02:53:20.078212 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.077890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f45bbf446-kqt6s" event={"ID":"8cf11b8c-24bf-498f-853e-dc64fd7cca7f","Type":"ContainerDied","Data":"501df13b1cb5b6ec33437b208ac054e5cc6588e686ff8682c06d2c9bd78fbe45"} Apr 21 02:53:20.078212 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.077914 2572 scope.go:117] "RemoveContainer" containerID="69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c" Apr 21 02:53:20.086098 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.086082 2572 scope.go:117] "RemoveContainer" containerID="69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c" Apr 21 02:53:20.086387 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:53:20.086363 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c\": container with ID starting with 69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c not found: ID does not exist" containerID="69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c" Apr 21 02:53:20.086459 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.086388 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c"} err="failed to get container status \"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c\": rpc error: code = NotFound 
desc = could not find container \"69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c\": container with ID starting with 69f7c5bc09d49e46a30559953863d9d41704760e7a03f17c946a5c8e98c90c2c not found: ID does not exist" Apr 21 02:53:20.091834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.091814 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"] Apr 21 02:53:20.097282 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:20.097260 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f45bbf446-kqt6s"] Apr 21 02:53:21.622366 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:53:21.622337 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" path="/var/lib/kubelet/pods/8cf11b8c-24bf-498f-853e-dc64fd7cca7f/volumes" Apr 21 02:54:58.682597 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.682568 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-ccb5685b7-lh666"] Apr 21 02:54:58.683026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.682939 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" containerName="authorino" Apr 21 02:54:58.683026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.682952 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" containerName="authorino" Apr 21 02:54:58.683026 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.683012 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cf11b8c-24bf-498f-853e-dc64fd7cca7f" containerName="authorino" Apr 21 02:54:58.686129 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.686114 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.692716 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.692688 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ccb5685b7-lh666"] Apr 21 02:54:58.754245 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.754213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj27r\" (UniqueName: \"kubernetes.io/projected/e7abe3da-fef2-4601-b656-59c818542974-kube-api-access-sj27r\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.754376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.754292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e7abe3da-fef2-4601-b656-59c818542974-oidc-ca\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.754376 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.754357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7abe3da-fef2-4601-b656-59c818542974-tls-cert\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.855200 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.855157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7abe3da-fef2-4601-b656-59c818542974-tls-cert\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.855429 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.855268 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj27r\" (UniqueName: \"kubernetes.io/projected/e7abe3da-fef2-4601-b656-59c818542974-kube-api-access-sj27r\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.855429 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.855311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e7abe3da-fef2-4601-b656-59c818542974-oidc-ca\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.855914 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.855889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/e7abe3da-fef2-4601-b656-59c818542974-oidc-ca\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.857594 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.857574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e7abe3da-fef2-4601-b656-59c818542974-tls-cert\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.862563 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.862541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj27r\" (UniqueName: \"kubernetes.io/projected/e7abe3da-fef2-4601-b656-59c818542974-kube-api-access-sj27r\") pod \"authorino-ccb5685b7-lh666\" (UID: \"e7abe3da-fef2-4601-b656-59c818542974\") " pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:58.996063 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:58.995979 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ccb5685b7-lh666" Apr 21 02:54:59.112613 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:59.112517 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ccb5685b7-lh666"] Apr 21 02:54:59.115473 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:54:59.115445 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7abe3da_fef2_4601_b656_59c818542974.slice/crio-bfcd3ae3b109f82876b2e3820e404e1cf413d0436dec1e085b29c93cc99aa84d WatchSource:0}: Error finding container bfcd3ae3b109f82876b2e3820e404e1cf413d0436dec1e085b29c93cc99aa84d: Status 404 returned error can't find the container with id bfcd3ae3b109f82876b2e3820e404e1cf413d0436dec1e085b29c93cc99aa84d Apr 21 02:54:59.408817 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:54:59.408780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ccb5685b7-lh666" event={"ID":"e7abe3da-fef2-4601-b656-59c818542974","Type":"ContainerStarted","Data":"bfcd3ae3b109f82876b2e3820e404e1cf413d0436dec1e085b29c93cc99aa84d"} Apr 21 02:55:00.413607 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.413573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ccb5685b7-lh666" event={"ID":"e7abe3da-fef2-4601-b656-59c818542974","Type":"ContainerStarted","Data":"df4f007f85239abcdc1bca089d27a7288aca234dd4aa48d79e451af5240c6d2b"} Apr 21 02:55:00.481639 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.481578 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-ccb5685b7-lh666" podStartSLOduration=2.109684027 podStartE2EDuration="2.481562533s" podCreationTimestamp="2026-04-21 02:54:58 +0000 UTC" firstStartedPulling="2026-04-21 02:54:59.117162032 +0000 UTC m=+832.050839087" lastFinishedPulling="2026-04-21 02:54:59.489040539 +0000 UTC 
m=+832.422717593" observedRunningTime="2026-04-21 02:55:00.450728335 +0000 UTC m=+833.384405410" watchObservedRunningTime="2026-04-21 02:55:00.481562533 +0000 UTC m=+833.415239613" Apr 21 02:55:00.481834 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.481815 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:55:00.482072 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.482032 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" podUID="371aab43-a7bf-419e-bab5-9d60d475ac2c" containerName="authorino" containerID="cri-o://73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2" gracePeriod=30 Apr 21 02:55:00.726193 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.726169 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:55:00.872855 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.872825 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca\") pod \"371aab43-a7bf-419e-bab5-9d60d475ac2c\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " Apr 21 02:55:00.873019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.872925 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert\") pod \"371aab43-a7bf-419e-bab5-9d60d475ac2c\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " Apr 21 02:55:00.873019 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.872948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trwln\" (UniqueName: \"kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln\") pod 
\"371aab43-a7bf-419e-bab5-9d60d475ac2c\" (UID: \"371aab43-a7bf-419e-bab5-9d60d475ac2c\") " Apr 21 02:55:00.874858 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.874827 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln" (OuterVolumeSpecName: "kube-api-access-trwln") pod "371aab43-a7bf-419e-bab5-9d60d475ac2c" (UID: "371aab43-a7bf-419e-bab5-9d60d475ac2c"). InnerVolumeSpecName "kube-api-access-trwln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:55:00.877440 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.877412 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "371aab43-a7bf-419e-bab5-9d60d475ac2c" (UID: "371aab43-a7bf-419e-bab5-9d60d475ac2c"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:55:00.883324 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.883304 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "371aab43-a7bf-419e-bab5-9d60d475ac2c" (UID: "371aab43-a7bf-419e-bab5-9d60d475ac2c"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:55:00.974115 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.974046 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/371aab43-a7bf-419e-bab5-9d60d475ac2c-tls-cert\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:55:00.974115 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.974070 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trwln\" (UniqueName: \"kubernetes.io/projected/371aab43-a7bf-419e-bab5-9d60d475ac2c-kube-api-access-trwln\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:55:00.974115 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:00.974080 2572 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/371aab43-a7bf-419e-bab5-9d60d475ac2c-oidc-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 21 02:55:01.418108 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.418077 2572 generic.go:358] "Generic (PLEG): container finished" podID="371aab43-a7bf-419e-bab5-9d60d475ac2c" containerID="73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2" exitCode=0 Apr 21 02:55:01.418526 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.418130 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" Apr 21 02:55:01.418526 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.418166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" event={"ID":"371aab43-a7bf-419e-bab5-9d60d475ac2c","Type":"ContainerDied","Data":"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2"} Apr 21 02:55:01.418526 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.418203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7cbf88f478-bxwvr" event={"ID":"371aab43-a7bf-419e-bab5-9d60d475ac2c","Type":"ContainerDied","Data":"16d52f1f53c56fe39f6666ab3a48e085888981cd7979633c4c6f285601921a21"} Apr 21 02:55:01.418526 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.418221 2572 scope.go:117] "RemoveContainer" containerID="73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2" Apr 21 02:55:01.427163 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.427139 2572 scope.go:117] "RemoveContainer" containerID="73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2" Apr 21 02:55:01.427412 ip-10-0-134-66 kubenswrapper[2572]: E0421 02:55:01.427396 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2\": container with ID starting with 73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2 not found: ID does not exist" containerID="73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2" Apr 21 02:55:01.427459 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.427421 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2"} err="failed to get container status \"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2\": rpc error: code = 
NotFound desc = could not find container \"73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2\": container with ID starting with 73350d29278317b5dd4e81c463b0212ea42835c50d16bb73376d5d5a445f8cc2 not found: ID does not exist" Apr 21 02:55:01.437079 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.437059 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:55:01.447644 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.447624 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7cbf88f478-bxwvr"] Apr 21 02:55:01.624664 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:01.624635 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="371aab43-a7bf-419e-bab5-9d60d475ac2c" path="/var/lib/kubelet/pods/371aab43-a7bf-419e-bab5-9d60d475ac2c/volumes" Apr 21 02:55:39.956994 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:39.956963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-ccb5685b7-lh666_e7abe3da-fef2-4601-b656-59c818542974/authorino/0.log" Apr 21 02:55:44.307787 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:44.307757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-zcjlh_274dfc18-7aac-4617-8591-baab33d20f16/manager/0.log" Apr 21 02:55:44.415056 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:44.415027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-x8k6q_2c8c63fa-d388-4479-acac-a4d0d7584161/postgres/0.log" Apr 21 02:55:45.773055 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:45.773023 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-ccb5685b7-lh666_e7abe3da-fef2-4601-b656-59c818542974/authorino/0.log" Apr 21 02:55:46.129765 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:46.129686 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gsqfp_021bd13d-9c14-450d-a843-0d7a2e31b55d/kuadrant-console-plugin/0.log" Apr 21 02:55:47.158418 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:47.158390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-55fc66fcf7-rd6z5_a90ecd6e-461f-481a-9742-b405e68ba59e/kube-auth-proxy/0.log" Apr 21 02:55:47.472965 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:47.472890 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bc899fb4-fbmpx_c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b/router/0.log" Apr 21 02:55:54.918616 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:54.918582 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sn49d_8d474c84-b62f-4695-9dda-3d8d9e6aacb7/global-pull-secret-syncer/0.log" Apr 21 02:55:55.003457 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:55.003428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2fslz_e34dd0ac-0e5c-4910-bc4e-288c8fff8b5d/konnectivity-agent/0.log" Apr 21 02:55:55.102348 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:55.102320 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-66.ec2.internal_19015d708a7f5256313024ebc4553800/haproxy/0.log" Apr 21 02:55:59.217039 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:59.217001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-ccb5685b7-lh666_e7abe3da-fef2-4601-b656-59c818542974/authorino/0.log" Apr 21 02:55:59.310738 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:55:59.310703 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gsqfp_021bd13d-9c14-450d-a843-0d7a2e31b55d/kuadrant-console-plugin/0.log" Apr 21 02:56:00.881572 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:00.881545 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/alertmanager/0.log" Apr 21 02:56:00.915581 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:00.915554 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/config-reloader/0.log" Apr 21 02:56:00.944745 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:00.944724 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/kube-rbac-proxy-web/0.log" Apr 21 02:56:00.967576 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:00.967552 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/kube-rbac-proxy/0.log" Apr 21 02:56:00.998388 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:00.998357 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/kube-rbac-proxy-metric/0.log" Apr 21 02:56:01.021735 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.021672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/prom-label-proxy/0.log" Apr 21 02:56:01.044756 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.044740 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eed554f9-0494-4890-85cd-17e7b666d556/init-config-reloader/0.log" Apr 21 02:56:01.336690 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.336665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kctj_0a67cba2-1302-4cf5-a038-09168abcdd03/node-exporter/0.log" Apr 21 02:56:01.364542 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.364517 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kctj_0a67cba2-1302-4cf5-a038-09168abcdd03/kube-rbac-proxy/0.log" Apr 21 02:56:01.386871 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.386849 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kctj_0a67cba2-1302-4cf5-a038-09168abcdd03/init-textfile/0.log" Apr 21 02:56:01.594195 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.594168 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/prometheus/0.log" Apr 21 02:56:01.615844 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.615819 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/config-reloader/0.log" Apr 21 02:56:01.643031 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.643009 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/thanos-sidecar/0.log" Apr 21 02:56:01.662843 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.662821 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/kube-rbac-proxy-web/0.log" Apr 21 02:56:01.683083 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.683055 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/kube-rbac-proxy/0.log" Apr 21 02:56:01.703691 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.703664 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/kube-rbac-proxy-thanos/0.log" Apr 21 02:56:01.761780 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.761755 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b56b5aa0-1d7a-4b5c-942a-f89eed682509/init-config-reloader/0.log" Apr 21 02:56:01.898977 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:01.898952 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-dbh5m_fd0a03f0-ed39-4a9c-9384-114d2033c596/prometheus-operator-admission-webhook/0.log" Apr 21 02:56:02.031063 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.031037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/thanos-query/0.log" Apr 21 02:56:02.057080 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.057050 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/kube-rbac-proxy-web/0.log" Apr 21 02:56:02.080158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.080137 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/kube-rbac-proxy/0.log" Apr 21 02:56:02.100533 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.100512 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/prom-label-proxy/0.log" Apr 21 02:56:02.122918 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.122900 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/kube-rbac-proxy-rules/0.log" Apr 21 02:56:02.140623 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:02.140605 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c454b744c-66tqf_e4f9224e-a677-484e-bef8-d062d8fca5c5/kube-rbac-proxy-metrics/0.log" Apr 21 
02:56:03.578565 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.578528 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"] Apr 21 02:56:03.579072 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.579055 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="371aab43-a7bf-419e-bab5-9d60d475ac2c" containerName="authorino" Apr 21 02:56:03.579116 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.579074 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="371aab43-a7bf-419e-bab5-9d60d475ac2c" containerName="authorino" Apr 21 02:56:03.579158 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.579126 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="371aab43-a7bf-419e-bab5-9d60d475ac2c" containerName="authorino" Apr 21 02:56:03.582061 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.582046 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.584265 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.584227 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"openshift-service-ca.crt\"" Apr 21 02:56:03.584417 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.584400 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjwp\"/\"default-dockercfg-rx8jc\"" Apr 21 02:56:03.585036 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.585018 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"kube-root-ca.crt\"" Apr 21 02:56:03.588377 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.588355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"] Apr 21 02:56:03.714586 ip-10-0-134-66 kubenswrapper[2572]: I0421 
02:56:03.714555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-lib-modules\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.714586 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.714588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-proc\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.714833 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.714628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-podres\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.714833 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.714651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-sys\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.714833 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.714791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p7fs\" (UniqueName: \"kubernetes.io/projected/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-kube-api-access-4p7fs\") pod 
\"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.807558 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.807532 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log" Apr 21 02:56:03.811694 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.811677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/2.log" Apr 21 02:56:03.816127 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p7fs\" (UniqueName: \"kubernetes.io/projected/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-kube-api-access-4p7fs\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-lib-modules\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816192 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816186 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-proc\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816293 
ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-podres\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816293 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-sys\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816375 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-proc\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816375 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-lib-modules\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816449 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-sys\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " 
pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.816449 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.816381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-podres\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.823453 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.823431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p7fs\" (UniqueName: \"kubernetes.io/projected/29ca8bc2-daf5-4e93-8a99-51dfafb50ca6-kube-api-access-4p7fs\") pod \"perf-node-gather-daemonset-p9cfl\" (UID: \"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" Apr 21 02:56:03.893931 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:03.893907 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"
Apr 21 02:56:04.011127 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:04.011104 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"]
Apr 21 02:56:04.013296 ip-10-0-134-66 kubenswrapper[2572]: W0421 02:56:04.013269 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod29ca8bc2_daf5_4e93_8a99_51dfafb50ca6.slice/crio-2a38b4b9dc2b6c3711a6b18f9e0f0742e6f573bb699fbb70d8ba5253589de824 WatchSource:0}: Error finding container 2a38b4b9dc2b6c3711a6b18f9e0f0742e6f573bb699fbb70d8ba5253589de824: Status 404 returned error can't find the container with id 2a38b4b9dc2b6c3711a6b18f9e0f0742e6f573bb699fbb70d8ba5253589de824
Apr 21 02:56:04.627680 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:04.627640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" event={"ID":"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6","Type":"ContainerStarted","Data":"b0af49fe61b17cbd4b245e0bb985964b2d873e89eb1c9cb204e26771ac25bd0e"}
Apr 21 02:56:04.627680 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:04.627677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" event={"ID":"29ca8bc2-daf5-4e93-8a99-51dfafb50ca6","Type":"ContainerStarted","Data":"2a38b4b9dc2b6c3711a6b18f9e0f0742e6f573bb699fbb70d8ba5253589de824"}
Apr 21 02:56:04.628105 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:04.627717 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"
Apr 21 02:56:04.644718 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:04.644673 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl" podStartSLOduration=1.644660612 podStartE2EDuration="1.644660612s" podCreationTimestamp="2026-04-21 02:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:56:04.642924562 +0000 UTC m=+897.576601637" watchObservedRunningTime="2026-04-21 02:56:04.644660612 +0000 UTC m=+897.578337687"
Apr 21 02:56:05.662969 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:05.662944 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5lb4v_7d9a58cd-c703-4352-b873-ebc3e5cc1cfd/dns/0.log"
Apr 21 02:56:05.682611 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:05.682592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5lb4v_7d9a58cd-c703-4352-b873-ebc3e5cc1cfd/kube-rbac-proxy/0.log"
Apr 21 02:56:05.723753 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:05.723732 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m26kz_55cefcc2-8412-4791-ab29-e4fbdd117f4a/dns-node-resolver/0.log"
Apr 21 02:56:06.192454 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:06.192428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bb69dcc45-djkbs_515fdc09-dc3e-4b87-a3c5-8db3f15b342f/registry/0.log"
Apr 21 02:56:06.233872 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:06.233846 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vk9x8_17e20017-572f-4919-8639-2e7007feee0b/node-ca/0.log"
Apr 21 02:56:07.173666 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:07.173638 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-55fc66fcf7-rd6z5_a90ecd6e-461f-481a-9742-b405e68ba59e/kube-auth-proxy/0.log"
Apr 21 02:56:07.248999 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:07.248972 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bc899fb4-fbmpx_c8b0f00e-ebfb-4a3b-b87d-ccdff2c64c2b/router/0.log"
Apr 21 02:56:07.509406 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:07.509316 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log"
Apr 21 02:56:07.510380 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:07.510354 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8jgxr_9246de05-b865-4564-bf19-9e73a72a4969/console-operator/1.log"
Apr 21 02:56:07.729371 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:07.729346 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pzlnx_b386dadb-05b0-41e3-8db6-4a3771883f69/serve-healthcheck-canary/0.log"
Apr 21 02:56:08.295359 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:08.295334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2652q_dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f/kube-rbac-proxy/0.log"
Apr 21 02:56:08.314286 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:08.314263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2652q_dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f/exporter/0.log"
Apr 21 02:56:08.334358 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:08.334337 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2652q_dccfeac3-9f28-4ca2-9fc1-95181bb0ba0f/extractor/0.log"
Apr 21 02:56:10.456775 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:10.456728 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-zcjlh_274dfc18-7aac-4617-8591-baab33d20f16/manager/0.log"
Apr 21 02:56:10.478402 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:10.478365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-x8k6q_2c8c63fa-d388-4479-acac-a4d0d7584161/postgres/0.log"
Apr 21 02:56:10.641074 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:10.641045 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-p9cfl"
Apr 21 02:56:11.708317 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:11.708279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64dc57f969-bmrzh_67ff259c-a10b-4e74-960e-826ad8e8f7d8/manager/0.log"
Apr 21 02:56:16.601133 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:16.601104 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z9799_b7aaabb9-fee1-4cd6-9c82-badd547250ae/kube-storage-version-migrator-operator/1.log"
Apr 21 02:56:16.602141 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:16.602121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z9799_b7aaabb9-fee1-4cd6-9c82-badd547250ae/kube-storage-version-migrator-operator/0.log"
Apr 21 02:56:17.783371 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.783341 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/kube-multus-additional-cni-plugins/0.log"
Apr 21 02:56:17.803014 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.802989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/egress-router-binary-copy/0.log"
Apr 21 02:56:17.822150 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.822131 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/cni-plugins/0.log"
Apr 21 02:56:17.842980 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.842959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/bond-cni-plugin/0.log"
Apr 21 02:56:17.864382 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.864362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/routeoverride-cni/0.log"
Apr 21 02:56:17.886066 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.886042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/whereabouts-cni-bincopy/0.log"
Apr 21 02:56:17.906564 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:17.906539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-h5dpc_ee8dd925-7241-4c8b-a7b6-bcbf8bab3a8f/whereabouts-cni/0.log"
Apr 21 02:56:18.100008 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:18.099943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rb6n9_d4201ca3-b268-4f22-89b7-f74f860bac2e/kube-multus/0.log"
Apr 21 02:56:18.152730 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:18.152706 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2f9pd_f3ca7174-0a17-4896-b723-717a079d23e3/network-metrics-daemon/0.log"
Apr 21 02:56:18.170484 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:18.170454 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2f9pd_f3ca7174-0a17-4896-b723-717a079d23e3/kube-rbac-proxy/0.log"
Apr 21 02:56:19.110757 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.110733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/ovn-controller/0.log"
Apr 21 02:56:19.131313 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.131291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/ovn-acl-logging/0.log"
Apr 21 02:56:19.153372 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.153348 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/kube-rbac-proxy-node/0.log"
Apr 21 02:56:19.174120 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.174095 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 02:56:19.194343 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.194325 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/northd/0.log"
Apr 21 02:56:19.213129 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.213104 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/nbdb/0.log"
Apr 21 02:56:19.232938 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.232905 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/sbdb/0.log"
Apr 21 02:56:19.338296 ip-10-0-134-66 kubenswrapper[2572]: I0421 02:56:19.338263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckmv8_9c680a54-5b50-40b0-b0c3-514ce8751675/ovnkube-controller/0.log"